Properly check for equivalent return types when determining whether overload resolution is ambiguous #2685

Open
rchen152 wants to merge 3 commits into facebook:main from rchen152:export-D95512431

Conversation

@rchen152
Contributor

rchen152 commented Mar 6, 2026

Summary: Fixes #2552.

Differential Revision: D95512431

meta-cla bot added the cla signed label Mar 6, 2026
@meta-codesync

meta-codesync bot commented Mar 6, 2026

@rchen152 has exported this pull request. If you are a Meta employee, you can view the originating Diff in D95512431.


rchen152 added a commit to rchen152/pyrefly that referenced this pull request Mar 6, 2026
…verload resolution is ambiguous (facebook#2685)

Summary:

Fixes facebook#2552.

Differential Revision: D95512431



rchen152 added a commit to rchen152/pyrefly that referenced this pull request Mar 10, 2026
…verload resolution is ambiguous (facebook#2685)

Summary:

Fixes facebook#2552.

Differential Revision: D95512431
Differential Revision: D95982216

Summary: Previously, we tried to use the hint during overload selection but would redo calls without the hint if we encountered errors. This diff reworks `call_overloads` to do overload selection without using the hint at all. The hint is used at the very end to produce a better return type after an overload has been selected. This is easier to follow and does less unnecessary work.

Differential Revision: D95908767
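The two-phase flow described in that summary can be sketched as a minimal runnable model. The names here (`Overload`, `call_overloads`, the subtype-narrowing "refinement") are hypothetical stand-ins, not pyrefly's actual Rust internals:

```python
from dataclasses import dataclass

@dataclass
class Overload:
    param: type  # accepted argument type
    ret: type    # declared return type

    def accepts(self, arg: object) -> bool:
        return isinstance(arg, self.param)

def call_overloads(overloads, arg, hint=None):
    # Phase 1: select an overload from the argument alone; the
    # contextual hint plays no part in selection.
    matches = [ov for ov in overloads if ov.accepts(arg)]
    if not matches:
        raise TypeError("no matching overload")
    ret = matches[0].ret
    # Phase 2: only after an overload is selected, use the hint to
    # produce a better return type (here: narrow to a subtype hint).
    if hint is not None and issubclass(hint, ret):
        ret = hint
    return ret

ovs = [Overload(int, int), Overload(str, str)]
print(call_overloads(ovs, 3))         # <class 'int'>
print(call_overloads(ovs, "x", str))  # <class 'str'>
```

Because the hint never influences which overload wins, there is no need to redo the call without the hint when errors occur.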
rchen152 added a commit to rchen152/pyrefly that referenced this pull request Mar 10, 2026
…verload resolution is ambiguous (facebook#2685)

Summary:
This diff fixes a user-reported issue in Pyrefly's overload resolution, where we accidentally considered non-equivalent types equivalent and therefore used the return type of the first matched overload rather than `Any` for ambiguous calls.

I don't think this diff is land-able on its own, due to the number of assert_type tests in numpy and scipy-stubs it breaks. I've put it up mostly so that we can see the difference in behavior between this diff and the next one, which proposes an alternate way of resolving ambiguous overload calls.


Fixes facebook#2552.

Differential Revision: D95512431
rchen152 added a commit to rchen152/pyrefly that referenced this pull request Mar 11, 2026
…verload resolution is ambiguous (facebook#2685)

Summary:
This diff fixes a user-reported issue in Pyrefly's overload resolution, where we accidentally considered non-equivalent types equivalent and therefore used the return type of the first matched overload rather than `Any` for ambiguous calls.

I'm hesitant to land this diff on its own, due to the number of assert_type tests in numpy and scipy-stubs it breaks. The next diff contains a proposal for a (non-spec-compliant) alternate way of resolving ambiguous calls that reduces the number of assert_type failures.


Fixes facebook#2552.

Differential Revision: D95512431
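The ambiguity being fixed can be shown with a small self-contained example. The `parse` function here is hypothetical; the expected behavior comes from the typing spec's rules for overloaded calls, not from this PR:

```python
from typing import Any, overload

@overload
def parse(x: int) -> int: ...
@overload
def parse(x: str) -> str: ...
def parse(x):
    return x

def call_with_any(x: Any):
    # Both overloads accept an argument of type `Any`, and their
    # return types (`int` vs `str`) are not equivalent. Per the
    # typing spec the call is therefore ambiguous, and the inferred
    # return type should be `Any` -- not `int`, the return type of
    # the first matched overload.
    return parse(x)

print(call_with_any("hello"))  # hello
```

Before this change, a too-loose equivalence check made the checker treat cases like this as unambiguous and report the first overload's return type.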
@github-actions

Diff from mypy_primer, showing the effect of this PR on open source code:

spack (https://github.com/spack/spack)
- ERROR lib/spack/spack/build_environment.py:613:31-44: Argument `Unknown | None` is not assignable to parameter `name` with type `Path | str` in function `spack.util.executable.Executable.__init__` [bad-argument-type]
- ERROR lib/spack/spack/filesystem_view.py:580:28-584:10: No matching overload found for function `posixpath.join` called with arguments: (str, str, Any | None) [no-matching-overload]
- ERROR lib/spack/spack/main.py:323:9-23: Class member `SpackArgumentParser.add_subparsers` overrides parent class `ArgumentParser` in an inconsistent manner [bad-override]
- ERROR lib/spack/spack/mixins.py:54:29-39: Object of class `NoneType` has no attribute `prefix` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:54:78-88: Object of class `NoneType` has no attribute `prefix` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:63:19-27: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:64:41-49: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:66:21-29: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:67:42-50: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:69:25-33: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:70:41-49: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:71:42-50: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:104:37-45: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:106:28-36: Object of class `NoneType` has no attribute `spec` [missing-attribute]
- ERROR lib/spack/spack/mixins.py:117:12-24: Object of class `NoneType` has no attribute `compiler` [missing-attribute]
- ERROR lib/spack/spack/test/relocate.py:39:26-32: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR lib/spack/spack/util/ctest_log_parser.py:351:17-34: `str | Any` is not assignable to attribute `source_file` with type `tuple[Unknown | None]` [bad-assignment]
- ERROR lib/spack/spack/util/ctest_log_parser.py:351:36-56: `str | Any` is not assignable to attribute `source_line_no` with type `tuple[Unknown | None]` [bad-assignment]

parso (https://github.com/davidhalter/parso)
- ERROR parso/python/prefix.py:84:19-30: Object of class `NoneType` has no attribute `group` [missing-attribute]
- ERROR parso/python/prefix.py:85:17-28: Object of class `NoneType` has no attribute `group` [missing-attribute]
- ERROR parso/python/prefix.py:96:17-26: Object of class `NoneType` has no attribute `end` [missing-attribute]
+ ERROR parso/python/tokenize.py:588:68-76: Index `1` out of range for string with 1 elements [bad-index]

kornia (https://github.com/kornia/kornia)
+ ERROR kornia/augmentation/presets/ada.py:200:25-60: No matching overload found for function `max` called with arguments: (Literal[0], float) [no-matching-overload]

beartype (https://github.com/beartype/beartype)
- ERROR beartype/_check/convert/_reduce/_pep/redpep484612646.py:323:13-25: Argument `HintPep646692UnpackedType | Iota | TypeVar | Unknown | None` is not assignable to parameter `hint` with type `TypeVar` in function `beartype._util.hint.pep.proposal.pep484.pep484typevar.get_hint_pep484_typevar_bounded_constraints_or_none` [bad-argument-type]
+ ERROR beartype/_check/convert/_reduce/_pep/redpep484612646.py:323:13-25: Argument `HintPep646692UnpackedType | Iota | TypeVar | Unknown` is not assignable to parameter `hint` with type `TypeVar` in function `beartype._util.hint.pep.proposal.pep484.pep484typevar.get_hint_pep484_typevar_bounded_constraints_or_none` [bad-argument-type]

dd-trace-py (https://github.com/DataDog/dd-trace-py)
- ERROR ddtrace/appsec/_utils.py:187:16-43: Returned type `Any | None` is not assignable to declared return type `int | str` [bad-return]
- ERROR ddtrace/contrib/internal/unittest/patch.py:650:25-37: Object of class `NoneType` has no attribute `trace` [missing-attribute]
- ERROR ddtrace/contrib/internal/unittest/patch.py:692:24-42: Object of class `NoneType` has no attribute `_start_span` [missing-attribute]
- ERROR ddtrace/contrib/internal/unittest/patch.py:736:23-41: Object of class `NoneType` has no attribute `_start_span` [missing-attribute]
- ERROR ddtrace/contrib/internal/unittest/patch.py:772:12-30: Object of class `NoneType` has no attribute `_start_span` [missing-attribute]
- ERROR ddtrace/internal/ci_visibility/recorder.py:943:16-115: Returned type `str | Any | None` is not assignable to declared return type `str` [bad-return]
- ERROR ddtrace/internal/coverage/instrumentation_py3_11.py:391:55-67: Argument `Unknown | None` is not assignable to parameter `end` with type `Instruction` in function `Branch.__init__` [bad-argument-type]
- ERROR ddtrace/internal/coverage/instrumentation_py3_11.py:392:9-29: Object of class `NoneType` has no attribute `targets` [missing-attribute]
- ERROR ddtrace/internal/sampling.py:174:85-99: Argument `Any | None` is not assignable to parameter `max_per_second` with type `int` in function `SpanSamplingRule.__init__` [bad-argument-type]
- ERROR ddtrace/llmobs/_integrations/langchain.py:847:16-72: Returned type `tuple[tuple[Unknown, Unknown, Unknown | None], str]` is not assignable to declared return type `tuple[tuple[int, int, int], str | None]` [bad-return]
- ERROR ddtrace/llmobs/_integrations/utils.py:314:94-101: Type `None` is not iterable [not-iterable]

pydantic (https://github.com/pydantic/pydantic)
- ERROR pydantic/_internal/_generate_schema.py:320:23-72: `None` is not subscriptable [unsupported-operation]

freqtrade (https://github.com/freqtrade/freqtrade)
- ERROR freqtrade/exchange/binance.py:92:20-95:14: Returned type `Any | None` is not assignable to declared return type `str` [bad-return]
- ERROR freqtrade/exchange/exchange.py:2964:21-34: Object of class `list` has no attribute `update` [missing-attribute]
- ERROR freqtrade/loggers/__init__.py:141:28-99: `Literal['/dev/log'] | tuple[Unknown, int] | Unknown` is not assignable to TypedDict key with type `str` [bad-typed-dict-key]
- ERROR freqtrade/loggers/__init__.py:180:33-49: `int` is not assignable to TypedDict key with type `str` [bad-typed-dict-key]
- ERROR freqtrade/loggers/__init__.py:181:36-38: `Literal[10]` is not assignable to TypedDict key with type `str` [bad-typed-dict-key]
- ERROR freqtrade/loggers/__init__.py:186:27-64: Object of class `bool` has no attribute `values`
- Object of class `int` has no attribute `values` [missing-attribute]
- ERROR freqtrade/optimize/hyperopt/hyperopt_auto.py:61:20-61: Returned type `Any | None` is not assignable to declared return type `(...) -> Unknown` [bad-return]
- ERROR freqtrade/plugins/pairlist/VolatilityFilter.py:146:13-27: Object of class `ndarray` has no attribute `fillna` [missing-attribute]
- ERROR freqtrade/plugins/pairlist/VolatilityFilter.py:148:33-48: Object of class `ndarray` has no attribute `rolling` [missing-attribute]

ibis (https://github.com/ibis-project/ibis)
- ERROR ibis/backends/sql/compilers/postgres.py:355:16-19: Expected a callable, got `None` [not-callable]
- ERROR ibis/legacy/udf/vectorized.py:38:12-46: Returned type `dict[@_, @_]` is not assignable to declared return type `tuple[Unknown, ...]` [bad-return]
+ ERROR ibis/legacy/udf/vectorized.py:38:12-46: Returned type `dict[str, @_]` is not assignable to declared return type `tuple[Unknown, ...]` [bad-return]

aioredis (https://github.com/aio-libs/aioredis)
- ERROR aioredis/client.py:164:31-51: Cannot set item in `dict[str, str]` [unsupported-operation]

scrapy (https://github.com/scrapy/scrapy)
- ERROR scrapy/core/http2/protocol.py:207:30-209:14: Argument `Any | None` is not assignable to parameter `download_maxsize` with type `int` in function `scrapy.core.http2.stream.Stream.__init__` [bad-argument-type]
- ERROR scrapy/core/http2/protocol.py:210:31-212:14: Argument `Any | None` is not assignable to parameter `download_warnsize` with type `int` in function `scrapy.core.http2.stream.Stream.__init__` [bad-argument-type]
+ ERROR scrapy/extensions/httpcache.py:210:23-42: No matching overload found for function `max` called with arguments: (Literal[0], float | int) [no-matching-overload]
- ERROR scrapy/statscollectors.py:78:31-74: No matching overload found for function `max` called with arguments: (Unknown | None, Any) [no-matching-overload]
- ERROR scrapy/statscollectors.py:81:31-74: No matching overload found for function `min` called with arguments: (Unknown | None, Any) [no-matching-overload]
- ERROR tests/test_proxy_connect.py:95:37-82: Cannot set item in `_Environ[str]` [unsupported-operation]

comtypes (https://github.com/enthought/comtypes)
- ERROR comtypes/_meta.py:16:14-33: Object of class `_Pointer` has no attribute `QueryInterface` [missing-attribute]

starlette (https://github.com/encode/starlette)
- ERROR starlette/requests.py:124:20-33: Object of class `NoneType` has no attribute `endswith` [missing-attribute]
- ERROR starlette/requests.py:125:17-28: `+=` is not supported between `None` and `Literal['/']` [unsupported-operation]

zulip (https://github.com/zulip/zulip)
- ERROR zerver/lib/message_cache.py:389:32-86: Argument `Any | None` is not assignable to parameter `rendering_realm_id` with type `int` in function `MessageDict.build_message_dict` [bad-argument-type]
- ERROR zerver/management/commands/backup.py:128:25-132:26: Argument `_TemporaryFileWrapper[bytes]` is not assignable to parameter `cm` with type `AbstractContextManager[_TemporaryFileWrapper[str]]` in function `contextlib._BaseExitStack.enter_context` [bad-argument-type]
- ERROR zerver/middleware.py:494:66-73: Argument `Any | None` is not assignable to parameter `urlconf` with type `str` in function `django.conf.urls.i18n.is_language_prefix_patterns_used` [bad-argument-type]

hydpy (https://github.com/hydpy-dev/hydpy)
- ERROR hydpy/models/conv/conv_model.py:394:41-55: Argument `ndarray[tuple[Any, ...], dtype[Unknown]]` is not assignable to parameter `double` with type `float` in function `hydpy.cythons.modelutils.isnan` [bad-argument-type]

pandas (https://github.com/pandas-dev/pandas)
- ERROR pandas/core/_numba/kernels/mean_.py:111:21-31: Argument `float | ndarray` is not assignable to parameter `prev_value` with type `float` in function `add_mean` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/mean_.py:136:21-31: Argument `float | ndarray` is not assignable to parameter `prev_value` with type `float` in function `add_mean` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/sum_.py:209:13-17: Argument `ndarray[tuple[Any, ...], dtype[signedinteger[_64Bit]]]` is not assignable to parameter `nobs` with type `int` in function `add_sum` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/sum_.py:212:13-39: Argument `ndarray[tuple[Any, ...], dtype[signedinteger[_64Bit]]]` is not assignable to parameter `num_consecutive_same_value` with type `int` in function `add_sum` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/var_.py:120:21-31: Argument `float | ndarray` is not assignable to parameter `prev_value` with type `float` in function `add_var` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/var_.py:145:21-31: Argument `float | ndarray` is not assignable to parameter `prev_value` with type `float` in function `add_var` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/var_.py:217:13-17: Argument `ndarray[tuple[Any, ...], dtype[signedinteger[_64Bit]]]` is not assignable to parameter `nobs` with type `int` in function `add_var` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/var_.py:218:13-19: Argument `ndarray` is not assignable to parameter `mean_x` with type `float` in function `add_var` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/var_.py:219:13-20: Argument `ndarray` is not assignable to parameter `ssqdm_x` with type `float` in function `add_var` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/var_.py:220:13-29: Argument `ndarray` is not assignable to parameter `compensation` with type `float` in function `add_var` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/var_.py:221:13-39: Argument `ndarray[tuple[Any, ...], dtype[signedinteger[_64Bit]]]` is not assignable to parameter `num_consecutive_same_value` with type `int` in function `add_var` [bad-argument-type]
- ERROR pandas/core/_numba/kernels/var_.py:222:13-23: Argument `ndarray` is not assignable to parameter `prev_value` with type `float` in function `add_var` [bad-argument-type]
- ERROR pandas/core/apply.py:592:63-67: Argument `DataFrame | Series | ndarray | Unknown` is not assignable to parameter `subset` with type `DataFrame | Series | None` in function `pandas.core.frame.DataFrame._gotitem` [bad-argument-type]
+ ERROR pandas/core/apply.py:592:63-67: Argument `DataFrame | Series | ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown` is not assignable to parameter `subset` with type `DataFrame | Series | None` in function `pandas.core.frame.DataFrame._gotitem` [bad-argument-type]
- ERROR pandas/core/array_algos/take.py:227:19-26: Argument `tuple[ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]]]` is not assignable to parameter `indexer` with type `ndarray` in function `wrapper` [bad-argument-type]
- ERROR pandas/core/arrays/boolean.py:264:12-24: Returned type `tuple[ndarray[tuple[int]] | ndarray, ndarray[tuple[int]] | ndarray[tuple[Any, ...], dtype[numpy.bool]] | ndarray | None]` is not assignable to declared return type `tuple[ndarray, ndarray]` [bad-return]
+ ERROR pandas/core/arrays/boolean.py:264:12-24: Returned type `tuple[ndarray[tuple[int]] | ndarray, ndarray[tuple[Any, ...], dtype[numpy.bool]] | ndarray | Unknown | None]` is not assignable to declared return type `tuple[ndarray, ndarray]` [bad-return]
- ERROR pandas/core/arrays/categorical.py:458:27-58: Object of class `ExtensionArray` has no attribute `_pa_array`
- Object of class `ndarray` has no attribute `_pa_array` [missing-attribute]
- ERROR pandas/core/arrays/categorical.py:494:25-53: Object of class `ExtensionArray` has no attribute `_codes`
- Object of class `ndarray` has no attribute `_codes` [missing-attribute]
- ERROR pandas/core/arrays/categorical.py:2307:33-62: Object of class `ndarray` has no attribute `_values` [missing-attribute]
- ERROR pandas/core/arrays/categorical.py:2308:33-63: Object of class `ndarray` has no attribute `_values` [missing-attribute]
- ERROR pandas/core/arrays/categorical.py:3162:35-52: Object of class `ExtensionArray` has no attribute `categories`
- Object of class `ndarray` has no attribute `categories` [missing-attribute]
- ERROR pandas/core/arrays/categorical.py:3162:61-73: Object of class `ExtensionArray` has no attribute `codes`
- Object of class `ndarray` has no attribute `codes` [missing-attribute]
- ERROR pandas/core/arrays/categorical.py:3166:17-29: Object of class `ExtensionArray` has no attribute `codes`
- Object of class `ndarray` has no attribute `codes` [missing-attribute]
- ERROR pandas/core/arrays/interval.py:678:29-33: Argument `ndarray | Unknown` is not assignable to parameter `left` with type `int` in function `pandas._libs.interval.Interval.__init__` [bad-argument-type]
- ERROR pandas/core/arrays/interval.py:678:35-40: Argument `ndarray | Unknown` is not assignable to parameter `right` with type `int` in function `pandas._libs.interval.Interval.__init__` [bad-argument-type]
- ERROR pandas/core/arrays/sparse/array.py:1713:20-29: Returned type `ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit]]]` is not assignable to declared return type `int` [bad-return]
- ERROR pandas/core/arrays/sparse/array.py:1715:20-29: Returned type `ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit]]]` is not assignable to declared return type `int` [bad-return]
- ERROR pandas/core/arrays/sparse/array.py:1717:20-29: Returned type `ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit]]]` is not assignable to declared return type `int` [bad-return]
- ERROR pandas/core/arrays/sparse/array.py:1721:20-29: Returned type `ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit]]]` is not assignable to declared return type `int` [bad-return]
- ERROR pandas/core/arrays/sparse/array.py:1723:20-24: Returned type `Literal[0] | ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit]]]` is not assignable to declared return type `int` [bad-return]
- ERROR pandas/core/arrays/string_.py:702:26-32: Argument `ExtensionArray | ndarray` is not assignable to parameter `values` with type `NumpyExtensionArray | ndarray` in function `pandas.core.arrays.numpy_.NumpyExtensionArray.__init__` [bad-argument-type]
- ERROR pandas/core/arrays/string_.py:873:25-39: Object of class `ExtensionArray` has no attribute `_ndarray`
+ ERROR pandas/core/arrays/string_.py:873:25-39: Object of class `NAType` has no attribute `_ndarray`
- Object of class `NAType` has no attribute `_ndarray`
- ERROR pandas/core/computation/ops.py:477:32-38: Argument `Any | None` is not assignable to parameter `cls` with type `type` in function `issubclass` [bad-argument-type]
- ERROR pandas/core/computation/ops.py:478:36-42: Argument `Any | None` is not assignable to parameter `cls` with type `type` in function `issubclass` [bad-argument-type]
- ERROR pandas/core/construction.py:611:51-55: Argument `ExtensionArray | ndarray | object` is not assignable to parameter `value` with type `Interval[Unknown] | Timedelta | Timestamp | bool | bytes | complex | complexfloating | date | datetime | datetime64 | float | floating | int | integer | str | timedelta | timedelta64` in function `pandas.core.dtypes.cast.construct_1d_arraylike_from_scalar` [bad-argument-type]
- ERROR pandas/core/construction.py:631:24-30: No matching overload found for function `list.__init__` called with arguments: (ExtensionArray | ndarray | object) [no-matching-overload]
- ERROR pandas/core/construction.py:684:20-26: No matching overload found for function `list.__init__` called with arguments: (ExtensionArray | object) [no-matching-overload]
- ERROR pandas/core/frame.py:2362:20-41: `in` is not supported between `ndarray` and `None` [not-iterable]
- ERROR pandas/core/frame.py:4156:28-41: `Index | ndarray | Any` is not assignable to attribute `_name` with type `Hashable` [bad-assignment]
+ ERROR pandas/core/frame.py:4156:28-41: `Index | Unknown` is not assignable to attribute `_name` with type `Hashable` [bad-assignment]
- ERROR pandas/core/frame.py:4292:47-58: Argument `Index | ndarray | Any` is not assignable to parameter `index` with type `Index` in function `pandas.core.indexes.multi.maybe_droplevels` [bad-argument-type]
- ERROR pandas/core/frame.py:4341:20-41: Returned type `ndarray | Any` is not assignable to declared return type `Interval[Unknown] | Timedelta | Timestamp | bool | bytes | complex | complexfloating | date | datetime | datetime64 | float | floating | int | integer | str | timedelta | timedelta64` [bad-return]
- ERROR pandas/core/frame.py:4647:51-55: Argument `Index | ndarray | Any` is not assignable to parameter `index` with type `Index` in function `pandas.core.indexes.multi.maybe_droplevels` [bad-argument-type]
- ERROR pandas/core/frame.py:4817:21-25: `Index | ndarray | Any` is not assignable to attribute `_name` with type `Hashable` [bad-assignment]
+ ERROR pandas/core/frame.py:4817:21-25: `Index | Unknown` is not assignable to attribute `_name` with type `Hashable` [bad-assignment]
- ERROR pandas/core/frame.py:6892:26-58: Object of class `ndarray` has no attribute `unique` [missing-attribute]
- ERROR pandas/core/frame.py:13719:16-27: Returned type `DataFrame | Series | ndarray | Unknown | Self@DataFrame` is not assignable to declared return type `DataFrame | Series` [bad-return]
+ ERROR pandas/core/frame.py:13719:16-27: Returned type `DataFrame | Series | ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown | Self@DataFrame` is not assignable to declared return type `DataFrame | Series` [bad-return]
- ERROR pandas/core/frame.py:18239:53-58: Argument `Index | ndarray | Any` is not assignable to parameter `key` with type `Hashable` in function `pandas.core.generic.NDFrame._get_label_or_level_values` [bad-argument-type]
+ ERROR pandas/core/frame.py:18239:53-58: Argument `Index | Unknown` is not assignable to parameter `key` with type `Hashable` in function `pandas.core.generic.NDFrame._get_label_or_level_values` [bad-argument-type]
- ERROR pandas/core/groupby/generic.py:1278:17-22: Argument `list[ndarray[tuple[int], dtype[Unknown]] | ndarray | ndarray[tuple[Any, ...], dtype[Unknown]]]` is not assignable to parameter `right_keys` with type `list[ArrayLike]` in function `pandas.core.reshape.merge.get_join_indexers` [bad-argument-type]
- ERROR pandas/core/groupby/ops.py:1190:9-29: Class member `BinGrouper.result_index_and_ids` overrides parent class `BaseGrouper` in an inconsistent manner [bad-override]
- ERROR pandas/core/indexes/base.py:732:22-64: Object of class `ndarray` has no attribute `unique` [missing-attribute]
- ERROR pandas/core/indexes/base.py:1027:16-23: Returned type `ndarray | Any | Self@Index` is not assignable to declared return type `Self@Index` [bad-return]
- ERROR pandas/core/indexes/base.py:3286:24-39: Object of class `ndarray` has no attribute `rename` [missing-attribute]
- ERROR pandas/core/indexes/base.py:3298:28-43: Object of class `ndarray` has no attribute `rename` [missing-attribute]
- ERROR pandas/core/indexes/base.py:3300:28-44: Object of class `ndarray` has no attribute `rename` [missing-attribute]
- ERROR pandas/core/indexes/base.py:3434:20-35: Object of class `ndarray` has no attribute `rename` [missing-attribute]
- ERROR pandas/core/indexes/base.py:3460:52-67: Object of class `ndarray` has no attribute `unique` [missing-attribute]
- ERROR pandas/core/indexes/base.py:4233:17-23: Argument `Index | MultiIndex | ndarray | Unknown | Self@Index` is not assignable to parameter `other` with type `Index` in function `Index._join_level` [bad-argument-type]
- ERROR pandas/core/indexes/base.py:4287:20-63: Returned type `tuple[ndarray | Any | Self@Index, ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]], None]` is not assignable to declared return type `tuple[Index, ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]] | None]` [bad-return]
- ERROR pandas/core/indexes/base.py:6316:30-81: Object of class `ndarray` has no attribute `unique` [missing-attribute]
- ERROR pandas/core/indexes/base.py:6609:23-37: Object of class `ndarray` has no attribute `array` [missing-attribute]
- ERROR pandas/core/indexes/base.py:8055:37-81: No matching overload found for function `numpy.ndarray.astype` called with arguments: (ArrowDtype) [no-matching-overload]
- ERROR pandas/core/indexes/datetimelike.py:769:21-35: Object of class `DatetimeTimedeltaMixin` has no attribute `_value`
+ ERROR pandas/core/indexes/datetimelike.py:769:21-35: Object of class `DatetimeTimedeltaMixin` has no attribute `_value` [missing-attribute]
- Object of class `ndarray` has no attribute `_value` [missing-attribute]
- ERROR pandas/core/indexes/datetimelike.py:769:37-52: Object of class `DatetimeTimedeltaMixin` has no attribute `_value`
+ ERROR pandas/core/indexes/datetimelike.py:769:37-52: Object of class `DatetimeTimedeltaMixin` has no attribute `_value` [missing-attribute]
- Object of class `ndarray` has no attribute `_value` [missing-attribute]
- ERROR pandas/core/indexes/datetimelike.py:853:45-50: Argument `ndarray | Unknown | Self@DatetimeTimedeltaMixin` is not assignable to parameter `start` with type `Hashable | None` in function `pandas.core.indexes.base.Index.slice_locs` [bad-argument-type]
+ ERROR pandas/core/indexes/datetimelike.py:853:45-50: Argument `Unknown | Self@DatetimeTimedeltaMixin` is not assignable to parameter `start` with type `Hashable | None` in function `pandas.core.indexes.base.Index.slice_locs` [bad-argument-type]
- ERROR pandas/core/indexes/datetimelike.py:853:52-55: Argument `ndarray | Unknown | Self@DatetimeTimedeltaMixin` is not assignable to parameter `end` with type `Hashable | None` in function `pandas.core.indexes.base.Index.slice_locs` [bad-argument-type]
+ ERROR pandas/core/indexes/datetimelike.py:853:52-55: Argument `Unknown | Self@DatetimeTimedeltaMixin` is not assignable to parameter `end` with type `Hashable | None` in function `pandas.core.indexes.base.Index.slice_locs` [bad-argument-type]
- ERROR pandas/core/indexes/interval.py:567:16-50: Object of class `ndarray` has no attribute `is_monotonic_increasing` [missing-attribute]
- ERROR pandas/core/indexes/interval.py:1256:16-26: Returned type `ndarray | Any | Self@IntervalIndex` is not assignable to declared return type `IntervalIndex` [bad-return]
- ERROR pandas/core/indexes/range.py:765:16-30: Returned type `tuple[ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], Index | int | ndarray | Any | Self@RangeIndex]` is not assignable to declared return type `tuple[ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]], RangeIndex]` [bad-return]
+ ERROR pandas/core/indexes/range.py:765:16-30: Returned type `tuple[ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], Index | int | Unknown | Self@RangeIndex]` is not assignable to declared return type `tuple[ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]], RangeIndex]` [bad-return]
- ERROR pandas/core/indexes/range.py:837:20-49: Returned type `tuple[Index | int | ndarray | Any | Self@RangeIndex, RangeIndex]` is not assignable to declared return type `tuple[Self@RangeIndex, RangeIndex | ndarray] | Self@RangeIndex` [bad-return]
+ ERROR pandas/core/indexes/range.py:837:20-49: Returned type `tuple[Index | int | Unknown | Self@RangeIndex, RangeIndex]` is not assignable to declared return type `tuple[Self@RangeIndex, RangeIndex | ndarray] | Self@RangeIndex` [bad-return]
- ERROR pandas/core/indexes/range.py:839:20-32: Returned type `Index | int | ndarray | Any | Self@RangeIndex` is not assignable to declared return type `tuple[Self@RangeIndex, RangeIndex | ndarray] | Self@RangeIndex` [bad-return]
+ ERROR pandas/core/indexes/range.py:839:20-32: Returned type `Index | int | Unknown | Self@RangeIndex` is not assignable to declared return type `tuple[Self@RangeIndex, RangeIndex | ndarray] | Self@RangeIndex` [bad-return]
- ERROR pandas/core/indexes/range.py:997:20-42: Object of class `int` has no attribute `_difference`
+ ERROR pandas/core/indexes/range.py:997:20-42: Object of class `int` has no attribute `_difference` [missing-attribute]
- Object of class `ndarray` has no attribute `_difference` [missing-attribute]
- ERROR pandas/core/indexes/range.py:1006:16-23: Argument `Index | int | ndarray | Unknown | Self@RangeIndex` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
+ ERROR pandas/core/indexes/range.py:1006:16-23: Argument `Index | int | Unknown | Self@RangeIndex` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
- ERROR pandas/core/indexes/range.py:1008:16-23: Argument `Index | int | ndarray | Unknown | Self@RangeIndex` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
+ ERROR pandas/core/indexes/range.py:1008:16-23: Argument `Index | int | Unknown | Self@RangeIndex` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
- ERROR pandas/core/indexes/range.py:1009:20-35: Object of class `int` has no attribute `rename`
+ ERROR pandas/core/indexes/range.py:1009:20-35: Object of class `int` has no attribute `rename` [missing-attribute]
- Object of class `ndarray` has no attribute `rename` [missing-attribute]
- ERROR pandas/core/indexes/range.py:1013:16-23: Argument `Index | int | ndarray | Unknown | Self@RangeIndex` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
+ ERROR pandas/core/indexes/range.py:1013:16-23: Argument `Index | int | Unknown | Self@RangeIndex` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
- ERROR pandas/core/indexes/range.py:1026:18-25: Argument `Index | int | ndarray | Unknown | Self@RangeIndex` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
+ ERROR pandas/core/indexes/range.py:1026:18-25: Argument `Index | int | Unknown | Self@RangeIndex` is not assignable to parameter `obj` with type `Sized` in function `len` [bad-argument-type]
- Object of class `int` has no attribute `step`
- Object of class `ndarray` has no attribute `step` [missing-attribute]
+ Object of class `int` has no attribute `step` [missing-attribute]
- ERROR pandas/core/indexes/range.py:1036:32-69: No matching overload found for function `range.__new__` called with arguments: (type[range], int, Index | int | ndarray | Unknown | Self@RangeIndex, int) [no-matching-overload]
+ ERROR pandas/core/indexes/range.py:1036:32-69: No matching overload found for function `range.__new__` called with arguments: (type[range], int, Index | int | Unknown | Self@RangeIndex, int) [no-matching-overload]
- Object of class `int` has no attribute `_range`
- Object of class `ndarray` has no attribute `_range` [missing-attribute]
+ Object of class `int` has no attribute `_range` [missing-attribute]
- ERROR pandas/core/indexes/range.py:1139:24-32: Returned type `Index | int | ndarray | Any | Self@RangeIndex` is not assignable to declared return type `Index` [bad-return]
+ ERROR pandas/core/indexes/range.py:1139:24-32: Returned type `Index | int | Unknown | Self@RangeIndex` is not assignable to declared return type `Index` [bad-return]
- ERROR pandas/core/indexes/range.py:1141:24-33: Returned type `Index | int | ndarray | Any | Self@RangeIndex` is not assignable to declared return type `Index` [bad-return]
+ ERROR pandas/core/indexes/range.py:1141:24-33: Returned type `Index | int | Unknown | Self@RangeIndex` is not assignable to declared return type `Index` [bad-return]
- ERROR pandas/core/indexes/range.py:1143:24-33: Returned type `Index | int | ndarray | Any | Self@RangeIndex` is not assignable to declared return type `Index` [bad-return]
+ ERROR pandas/core/indexes/range.py:1143:24-33: Returned type `Index | int | Unknown | Self@RangeIndex` is not assignable to declared return type `Index` [bad-return]
- ERROR pandas/core/internals/blocks.py:1204:29-46: Cannot index into `ExtensionArray` [bad-index]
- ERROR pandas/core/internals/construction.py:259:22-36: Object of class `ExtensionArray` has no attribute `reshape` [missing-attribute]
- ERROR pandas/core/internals/construction.py:299:39-45: Argument `ExtensionArray | ndarray | Unknown` is not assignable to parameter `values` with type `ndarray` in function `_check_values_indices_shape_match` [bad-argument-type]
- ERROR pandas/core/internals/construction.py:791:24-39: Returned type `tuple[list[ndarray], Index]` is not assignable to declared return type `tuple[list[ArrayLike], Index]` [bad-return]
- ERROR pandas/core/internals/construction.py:799:16-31: Returned type `tuple[list[ndarray], Index]` is not assignable to declared return type `tuple[list[ArrayLike], Index]` [bad-return]
- ERROR pandas/core/internals/construction.py:854:24-40: No matching overload found for function `numpy._core.shape_base.vstack` called with arguments: (list[ExtensionArray | ndarray]) [no-matching-overload]
- ERROR pandas/core/internals/managers.py:445:53-84: Argument `list[DatetimeIndex | Index | TimedeltaIndex | ndarray[tuple[Any, ...], Unknown] | Unknown]` is not assignable to parameter `axes` with type `list[Index]` in function `BaseBlockManager.from_blocks` [bad-argument-type]
- ERROR pandas/core/internals/managers.py:573:30-45: Cannot index into `ExtensionArray` [bad-index]
- ERROR pandas/core/internals/managers.py:735:20-28: `list[DatetimeIndex | Index | TimedeltaIndex | ndarray[tuple[Any, ...], Unknown] | Unknown]` is not assignable to attribute `axes` with type `list[Index]` [bad-assignment]
- ERROR pandas/core/internals/managers.py:870:13-31: Cannot set item in `list[Index]` [unsupported-operation]
- ERROR pandas/core/internals/managers.py:1158:46-65: Argument `DatetimeIndex | Index | TimedeltaIndex | ndarray[tuple[Any, ...], Unknown] | Unknown` is not assignable to parameter `axis` with type `Index` in function `SingleBlockManager.__init__` [bad-argument-type]
- ERROR pandas/core/internals/managers.py:1203:42-61: Argument `DatetimeIndex | Index | TimedeltaIndex | ndarray[tuple[Any, ...], Unknown] | Unknown` is not assignable to parameter `axis` with type `Index` in function `SingleBlockManager.__init__` [bad-argument-type]
- ERROR pandas/core/internals/managers.py:1217:39-58: Argument `DatetimeIndex | Index | TimedeltaIndex | ndarray[tuple[Any, ...], Unknown] | Unknown` is not assignable to parameter `axis` with type `Index` in function `SingleBlockManager.__init__` [bad-argument-type]
- ERROR pandas/core/internals/managers.py:1619:39-43: Argument `list[Index | ndarray | Any]` is not assignable to parameter `axes` with type `Sequence[Index]` in function `BlockManager.__init__` [bad-argument-type]
- ERROR pandas/core/internals/managers.py:1656:54-82: Argument `list[DatetimeIndex | Index | TimedeltaIndex | ndarray[tuple[Any, ...], Unknown] | Unknown]` is not assignable to parameter `axes` with type `list[Index]` in function `BlockManager.from_blocks` [bad-argument-type]
- ERROR pandas/core/internals/managers.py:1723:35-43: Argument `list[DatetimeIndex | Index | TimedeltaIndex | ndarray[tuple[Any, ...], Unknown] | Unknown]` is not assignable to parameter `axes` with type `Sequence[Index]` in function `BlockManager.__init__` [bad-argument-type]
- ERROR pandas/core/internals/managers.py:2053:45-49: Argument `list[DatetimeIndex | Index | TimedeltaIndex | ndarray[tuple[Any, ...], Unknown] | Unknown]` is not assignable to parameter `axes` with type `Sequence[Index]` in function `BlockManager.__init__` [bad-argument-type]
- ERROR pandas/core/internals/managers.py:2151:34-41: Argument `Index | ndarray | Any` is not assignable to parameter `axis` with type `Index` in function `SingleBlockManager.__init__` [bad-argument-type]
- ERROR pandas/core/methods/selectn.py:116:20-32: Returned type `Series | ndarray | Unknown` is not assignable to declared return type `Series` [bad-return]
+ ERROR pandas/core/methods/selectn.py:116:20-32: Returned type `Series | ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown` is not assignable to declared return type `Series` [bad-return]
- ERROR pandas/core/resample.py:2664:22-35: Object of class `ndarray` has no attribute `insert` [missing-attribute]
- ERROR pandas/core/resample.py:2707:26-37: `DatetimeIndex | ndarray | Any` is not assignable to variable `binner` with type `DatetimeIndex` [bad-assignment]
- ERROR pandas/core/resample.py:2831:14-23: Object of class `ndarray` has no attribute `asi8` [missing-attribute]
- ERROR pandas/core/resample.py:2842:21-31: Object of class `ndarray` has no attribute `_data` [missing-attribute]
- ERROR pandas/core/reshape/concat.py:722:23-67: Object of class `ndarray` has no attribute `unique`

... (truncated 783 lines) ...

pip (https://github.com/pypa/pip)
- ERROR src/pip/_internal/network/auth.py:336:16-54: Returned type `Literal[b'']` is not assignable to declared return type `str | None` [bad-return]
- ERROR src/pip/_vendor/pygments/lexer.py:918:9-23: Class member `ProfilingRegexLexerMeta._process_regex` overrides parent class `RegexLexerMeta` in an inconsistent manner [bad-param-name-override]
+ ERROR src/pip/_vendor/rich/progress.py:677:26-45: No matching overload found for function `max` called with arguments: (Literal[0], float) [no-matching-overload]

tornado (https://github.com/tornadoweb/tornado)
+ ERROR tornado/platform/asyncio.py:223:16-39: No matching overload found for function `max` called with arguments: (Literal[0], float) [no-matching-overload]
+ ERROR tornado/websocket.py:1356:19-55: No matching overload found for function `max` called with arguments: (Literal[0], float) [no-matching-overload]
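
Several of the newly surfaced errors above share one shape: `max(Literal[0], float)`. A minimal sketch of that call pattern (the function names here are hypothetical, written only to reproduce the diagnostic): typeshed's `max` overloads take arguments of a single comparable type, so under the stricter equivalent-return-type check, mixing a literal `int` with a `float` can leave no unambiguous overload. Using a matching `float` literal is one way the flagged call sites could avoid it.

```python
def clamp_non_negative(x: float) -> float:
    # Mixed int/float arguments: the pattern flagged as [no-matching-overload]
    # in the tornado/pip/home-assistant diffs above.
    return max(0, x)

def clamp_non_negative_fixed(x: float) -> float:
    # Both arguments are float, so a single overload applies cleanly.
    return max(0.0, x)
```

At runtime both behave identically; only the static overload resolution differs.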

egglog-python (https://github.com/egraphs-good/egglog-python)
- ERROR python/egglog/exp/array_api_numba.py:50:50-78: Object of class `int` has no attribute `to_value` [missing-attribute]
- ERROR python/tests/__snapshots__/test_array_api/test_jit[lda][expr].py:13:17-78: Object of class `int` has no attribute `to_value` [missing-attribute]
- ERROR python/tests/__snapshots__/test_array_api/test_jit[lda][expr].py:14:17-78: Object of class `int` has no attribute `to_value` [missing-attribute]
- ERROR python/tests/__snapshots__/test_array_api/test_jit[lda][expr].py:15:17-78: Object of class `int` has no attribute `to_value` [missing-attribute]
- ERROR python/tests/__snapshots__/test_array_api/test_jit[lda][expr].py:54:53-156: Object of class `int` has no attribute `to_value` [missing-attribute]
- ERROR python/tests/__snapshots__/test_array_api/test_jit[lda][initial_expr].py:59:53-156: Object of class `int` has no attribute `to_value` [missing-attribute]

porcupine (https://github.com/Akuli/porcupine)
- ERROR porcupine/plugins/highlight/tree_sitter_highlighter.py:83:20-72: Returned type `Unknown | None` is not assignable to declared return type `str` [bad-return]

cwltool (https://github.com/common-workflow-language/cwltool)
- ERROR cwltool/checker.py:462:47-54: Cannot set item in `MutableMapping[str, MutableMapping[str, CWLOutputType] | MutableSequence[CWLOutputType] | bool | float | int | str | None]` [unsupported-operation]
- ERROR cwltool/command_line_tool.py:853:78-98: `dict[str, list[MutableMapping[str, CWLOutputType] | MutableSequence[CWLOutputType] | bool | float | int | str] | list[Any]]` is not assignable to `dict[str, MutableMapping[str, MutableMapping[str, CWLOutputType] | MutableSequence[CWLOutputType] | bool | float | int | str | None] | MutableSequence[int | str]]` [bad-assignment]

scipy-stubs (https://github.com/scipy/scipy-stubs)
+ ERROR tests/linalg/test__sketches.pyi:36:12-71: assert_type(Unknown, ndarray) failed [assert-type]
+ ERROR tests/linalg/test__sketches.pyi:43:12-73: assert_type(Unknown, csc_matrix) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:81:12-53: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[unsignedinteger[_8Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:86:12-56: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:91:12-65: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:114:12-54: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[unsignedinteger[_8Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:119:12-57: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:142:12-66: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[unsignedinteger[_8Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:147:12-69: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:166:12-52: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[unsignedinteger[_8Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:171:12-55: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:190:12-56: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:195:12-57: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:214:12-64: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/linalg/test__special_matrices.pyi:219:12-65: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/ndimage/test__filters.pyi:294:12-70: assert_type(Unknown, ndarray[tuple[int, int], dtype[float64]]) failed [assert-type]
+ ERROR tests/ndimage/test__filters.pyi:295:12-72: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/ndimage/test__filters.pyi:301:12-81: assert_type(Unknown, ndarray[tuple[int, int], dtype[float64]]) failed [assert-type]
+ ERROR tests/ndimage/test__filters.pyi:302:12-83: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/signal/test_czt.pyi:29:12-58: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[_Complex]]) failed [assert-type]
+ ERROR tests/signal/test_czt.pyi:38:12-59: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[_Complex]]) failed [assert-type]
+ ERROR tests/signal/test_czt.pyi:51:12-48: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[_Complex]]) failed [assert-type]
+ ERROR tests/signal/test_czt.pyi:59:12-58: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[_Complex]]) failed [assert-type]
+ ERROR tests/signal/test_signaltools.pyi:328:12-57: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[complex128]]) failed [assert-type]
+ ERROR tests/signal/test_signaltools.pyi:329:12-57: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[complexfloating[_32Bit, _32Bit]]]) failed [assert-type]
+ ERROR tests/signal/test_signaltools.pyi:330:12-57: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[complexfloating[_32Bit, _32Bit]]]) failed [assert-type]
+ ERROR tests/signal/test_signaltools.pyi:331:12-58: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[complex128]]) failed [assert-type]
+ ERROR tests/signal/test_signaltools.pyi:332:12-67: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[cfloating80]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:32:12-107: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:36:12-107: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:40:12-108: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:44:12-104: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:50:12-106: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:54:12-106: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:58:12-107: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:62:12-103: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:68:12-108: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:72:12-108: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:76:12-109: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/sparse/linalg/test__expm_multiply.pyi:80:12-105: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/spatial/test__rigid_transform.pyi:88:12-65: assert_type(Unknown, RigidTransform) failed [assert-type]
+ ERROR tests/spatial/test__rigid_transform.pyi:98:12-70: assert_type(Unknown, RigidTransform) failed [assert-type]
+ ERROR tests/spatial/test__rigid_transform.pyi:107:12-78: assert_type(Unknown, RigidTransform) failed [assert-type]
+ ERROR tests/spatial/test__rigid_transform.pyi:108:12-78: assert_type(Unknown, RigidTransform) failed [assert-type]
+ ERROR tests/spatial/test__rigid_transform.pyi:109:12-78: assert_type(Unknown, RigidTransform) failed [assert-type]
+ ERROR tests/spatial/test__rigid_transform.pyi:114:12-69: assert_type(Unknown, RigidTransform) failed [assert-type]
+ ERROR tests/spatial/test__rigid_transform.pyi:119:12-68: assert_type(Unknown, RigidTransform) failed [assert-type]
+ ERROR tests/spatial/test__rigid_transform.pyi:130:12-74: assert_type(Unknown, RigidTransform) failed [assert-type]
+ ERROR tests/spatial/test__rotation.pyi:37:12-51: assert_type(Unknown, Rotation) failed [assert-type]
+ ERROR tests/spatial/test__rotation.pyi:44:12-53: assert_type(Unknown, Rotation) failed [assert-type]
+ ERROR tests/spatial/test__rotation.pyi:48:12-50: assert_type(Unknown, Rotation) failed [assert-type]
+ ERROR tests/spatial/test__rotation.pyi:81:12-64: assert_type(Unknown, Rotation) failed [assert-type]
+ ERROR tests/stats/test__stats_py.pyi:310:12-57: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:119:12-73: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:120:12-73: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:121:12-73: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:122:12-73: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:127:12-55: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:128:12-55: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:129:12-55: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:130:12-55: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:137:12-70: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:138:12-70: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[float64]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:139:12-70: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]
+ ERROR tests/stats/test_lmoment.pyi:140:12-70: assert_type(Unknown, ndarray[tuple[Any, ...], dtype[floating[_32Bit]]]) failed [assert-type]

materialize (https://github.com/MaterializeInc/materialize)
- ERROR misc/python/materialize/scratch.py:345:9-350:10: Argument `Future[list[None]]` is not assignable to parameter `future` with type `Awaitable[tuple[None]]` in function `asyncio.events.AbstractEventLoop.run_until_complete` [bad-argument-type]
- ERROR misc/python/materialize/workload_replay/data.py:336:29-337:50: `*` is not supported between `None` and `float` [unsupported-operation]

aiohttp (https://github.com/aio-libs/aiohttp)
- ERROR aiohttp/web.py:411:29-79: Argument `Future[list[BaseException | Any]]` is not assignable to parameter `future` with type `Awaitable[tuple[Any]]` in function `asyncio.events.AbstractEventLoop.run_until_complete` [bad-argument-type]

core (https://github.com/home-assistant/core)
- ERROR homeassistant/components/airos/config_flow.py:354:38-66: Argument `dict[str, Any | None]` is not assignable to parameter `description_placeholders` with type `Mapping[str, str] | None` in function `homeassistant.config_entries.ConfigFlow.async_show_form` [bad-argument-type]
- ERROR homeassistant/components/application_credentials/__init__.py:144:16-27: Returned type `dict[Unknown | None, ClientCredential]` is not assignable to declared return type `dict[str, ClientCredential]` [bad-return]
- ERROR homeassistant/components/aprilaire/entity.py:37:27-39:69: `Any | None` is not assignable to `bool` [bad-assignment]
- ERROR homeassistant/components/aprilaire/entity.py:41:25-75: `Any | None` is not assignable to `bool` [bad-assignment]
- ERROR homeassistant/components/bayesian/config_flow.py:558:31-83: Argument `Any | None` is not assignable to parameter `title` with type `UndefinedType | str` in function `homeassistant.config_entries.ConfigSubentryFlow.async_update_and_abort` [bad-argument-type]
+ ERROR homeassistant/components/bluesound/media_player.py:696:21-32: No matching overload found for function `max` called with arguments: (Literal[0], float) [no-matching-overload]
- ERROR homeassistant/components/ezviz/select.py:79:12-34: Object of class `NoneType` has no attribute `name` [missing-attribute]
- ERROR homeassistant/components/fritz/services.py:65:17-73: Argument `Any | None` is not assignable to parameter `length` with type `int` in function `homeassistant.components.fritz.coordinator.FritzBoxTools.async_trigger_set_guest_password` [bad-argument-type]
- ERROR homeassistant/components/gogogate2/config_flow.py:96:54-59: Argument `Any | None` is not assignable to parameter `title` with type `str` in function `homeassistant.config_entries.ConfigFlow.async_create_entry` [bad-argument-type]
- ERROR homeassistant/components/google_assistant/smart_home.py:256:17-261:18: Argument `Future[list[Unknown]]` is not assignable to parameter `arg` with type `Awaitable[tuple[Unknown]] | Future[tuple[Unknown]]` in function `asyncio.tasks.shield` [bad-argument-type]
- ERROR homeassistant/components/group/__init__.py:237:17-55: Argument `Any | None` is not assignable to parameter `name` with type `str` in function `homeassistant.components.group.entity.Group.async_create_group` [bad-argument-type]
- ERROR homeassistant/components/hassio/backup.py:139:14-141:10: Argument `Unknown | None` is not assignable to parameter `date` with type `str` in function `homeassistant.components.backup.models.AgentBackup.__init__` [bad-argument-type]
- ERROR homeassistant/components/hassio/coordinator.py:392:46-80: Argument `dict_values[Unknown, dict[str | Any, Unknown | None]]` is not assignable to parameter `addons` with type `list[dict[str, Any]]` in function `async_register_addons_in_dev_reg` [bad-argument-type]
+ ERROR homeassistant/components/hassio/coordinator.py:392:46-80: Argument `dict_values[Unknown, dict[str | Any, Unknown]]` is not assignable to parameter `addons` with type `list[dict[str, Any]]` in function `async_register_addons_in_dev_reg` [bad-argument-type]
- ERROR homeassistant/components/homeassistant/__init__.py:433:46-60: Argument `dict[str, Any | None]` is not assignable to parameter `translation_placeholders` with type `dict[str, str] | None` in function `homeassistant.helpers.issue_registry.async_create_issue` [bad-argument-type]
- ERROR homeassistant/components/hydrawise/sensor.py:143:16-49: Object of class `NoneType` has no attribute `total_use` [missing-attribute]
- ERROR homeassistant/components/hyperion/camera.py:160:20-66: Object of class `NoneType` has no attribute `get` [missing-attribute]
- ERROR homeassistant/components/knx/number.py:136:39-139:10: `Any | None` is not assignable to attribute `_attr_native_max_value` with type `float` [bad-assignment]
- ERROR homeassistant/components/knx/number.py:140:39-143:10: `Any | None` is not assignable to attribute `_attr_native_min_value` with type `float` [bad-assignment]
- ERROR homeassistant/components/knx/number.py:144:34-147:10: `Any | None` is not assignable to attribute `_attr_native_step` with type `float` [bad-assignment]
- ERROR homeassistant/components/knx/number.py:154:46-78: No matching overload found for function `max` called with arguments: (Literal[0], float) [no-matching-overload]
- ERROR homeassistant/components/litejet/light.py:110:73-85: No matching overload found for function `int.__new__` called with arguments: (type[int], Any | None) [no-matching-overload]
+ ERROR homeassistant/components/media_player/__init__.py:1051:20-61: No matching overload found for function `min` called with arguments: (Literal[1], float) [no-matching-overload]
+ ERROR homeassistant/components/media_player/__init__.py:1069:20-61: No matching overload found for function `max` called with arguments: (Literal[0], float) [no-matching-overload]
- ERROR homeassistant/components/mjpeg/config_flow.py:162:27-80: Argument `Any | None` is not assignable to parameter `title` with type `str` in function `homeassistant.config_entries.ConfigFlow.async_create_entry` [bad-argument-type]
- ERROR homeassistant/components/modbus/modbus.py:179:16-46: Returned type `tuple[ModbusHub | Unknown, Any | None, Any]` is not assignable to declared return type `tuple[ModbusHub, int, int]` [bad-return]
- ERROR homeassistant/components/nfandroidtv/notify.py:105:35-107:22: No matching overload found for function `int.__new__` called with arguments: (type[int], Unknown | None) [no-matching-overload]
- ERROR homeassistant/components/omnilogic/sensor.py:136:13-138:14: Invalid key for TypedDict `<anonymous>`, got `None` [bad-typed-dict-key]
- ERROR homeassistant/components/orvibo/config_flow.py:204:19-67: Argument `Any | None` is not assignable to parameter `title` with type `str` in function `homeassistant.config_entries.ConfigFlow.async_create_entry` [bad-argument-type]
+ ERROR homeassistant/components/refoss/sensor.py:101:29-35: No matching overload found for function `max` called with arguments: (Literal[0], float) [no-matching-overload]
- ERROR homeassistant/components/shelly/config_flow.py:1249:57-61: Argument `Any | None` is not assignable to parameter `port` with type `int` in function `ShellyConfigFlow._async_get_info` [bad-argument-type]
+ ERROR homeassistant/components/smartthings/sensor.py:383:26-71: Argument `(value: Any) -> object | str | None` is not assignable to parameter `value_fn` with type `(Any) -> datetime | float | int | str | None` in function `SmartThingsSensorEntityDescription.__init__` [bad-argument-type]
+ ERROR homeassistant/components/smartthings/sensor.py:436:26-71: Argument `(value: Any) -> object | str | None` is not assignable to parameter `value_fn` with type `(Any) -> datetime | float | int | str | None` in function `SmartThingsSensorEntityDescription.__init__` [bad-argument-type]
+ ERROR homeassistant/components/smartthings/sensor.py:1181:26-71: Argument `(value: Any) -> object | str | None` is not assignable to parameter `value_fn` with type `(Any) -> datetime | float | int | str | None` in function `SmartThingsSensorEntityDescription.__init__` [bad-argument-type]
+ ERROR homeassistant/components/smartthings/sensor.py:1381:13-1383:68: Returned type `UnitOfPressure | UnitOfTemperature | UnitOfVolume | object | str | None` is not assignable to declared return type `str | None` [bad-return]
- ERROR homeassistant/components/snmp/switch.py:261:29-74: Expected a callable, got `None` [not-callable]
- ERROR homeassistant/components/ssdp/__init__.py:62:9-65:10: Argument `((SsdpServiceInfo, SsdpChange) -> Coroutine[Any, Any, None]) | ((SsdpServiceInfo, SsdpChange) -> None)` is not assignable to parameter `target` with type `(SsdpServiceInfo, SsdpChange) -> Coroutine[Any, Any, None]` in function `homeassistant.core.HassJob.__init__` [bad-argument-type]
+ ERROR homeassistant/components/ssdp/__init__.py:62:9-65:10: Argument `((SsdpServiceInfo, SsdpChange) -> Coroutine[Any, Any, None]) | ((SsdpServiceInfo, SsdpChange) -> None)` is not assignable to parameter `target` with type `(**tuple[Unknown, ...]) -> Coroutine[Any, Any, None]` in function `homeassistant.core.HassJob.__init__` [bad-argument-type]
- ERROR homeassistant/components/switchbot/__init__.py:227:22-25: Expected a callable, got `None` [not-callable]
- ERROR homeassistant/components/switchbot/__init__.py:241:18-21: Expected a callable, got `None` [not-callable]
- ERROR homeassistant/components/volumio/config_flow.py:88:30-58: `Any | None` is not assignable to attribute `_name` with type `str` [bad-assignment]
+ ERROR homeassistant/components/wilight/light.py:112:15-51: No matching overload found for function `min` called with arguments: (Literal[360], float) [no-matching-overload]
+ ERROR homeassistant/components/wilight/light.py:122:15-51: No matching overload found for function `min` called with arguments: (Literal[100], float) [no-matching-overload]
- ERROR homeassistant/components/yandextts/tts.py:144:71-80: Argument `dict[str, str | Unknown | None]` is not assignable to parameter `params` with type `Mapping[str, Sequence[SupportsInt | float | str] | SupportsInt | float | str] | Sequence[tuple[str, Sequence[SupportsInt | float | str] | SupportsInt | float | str]] | str | None` in function `aiohttp.client.ClientSession.get` [bad-argument-type]
- ERROR homeassistant/core_config.py:143:25-28: Argument `Any | None` is not assignable to parameter `element` with type `str` in function `set.add` [bad-argument-type]
- ERROR homeassistant/helpers/integration_platform.py:184:9-187:10: Argument `((HomeAssistant, str, Any) -> Coroutine[Any, Any, None]) | ((HomeAssistant, str, Any) -> None)` is not assignable to parameter `target` with type `(HomeAssistant, str, Any) -> Coroutine[Any, Any, None]` in function `homeassistant.core.HassJob.__init__` [bad-argument-type]
+ ERROR homeassistant/helpers/integration_platform.py:184:9-187:10: Argument `((HomeAssistant, str, Any) -> Coroutine[Any, Any, None]) | ((HomeAssistant, str, Any) -> None)` is not assignable to parameter `target` with type `(**tuple[Unknown, ...]) -> Coroutine[Any, Any, None]` in function `homeassistant.core.HassJob.__init__` [bad-argument-type]
- ERROR homeassistant/helpers/script.py:1041:24-78: Argument `Any | None` is not assignable to parameter `default_message` with type `str` in function `_ScriptRun._step_log` [bad-argument-type]

mkosi (https://github.com/systemd/mkosi)
- ERROR mkosi/config.py:6039:29-53: Argument `Any | None` is not assignable to parameter `file_id` with type `str` in function `Drive.__init__` [bad-argument-type]
- ERROR mkosi/qemu.py:604:42-96: Argument `_TemporaryFileWrapper[bytes]` is not assignable to parameter `cm` with type `AbstractContextManager[_TemporaryFileWrapper[str]]` in function `contextlib._BaseExitStack.enter_context` [bad-argument-type]

trio (https://github.com/python-trio/trio)
- ERROR src/trio/_core/_parking_lot.py:200:23-30: No matching overload found for function `range.__new__` called with arguments: (type[range], float | int) [no-matching-overload]
+ ERROR src/trio/_core/_parking_lot.py:199:24-50: No matching overload found for function `min` called with arguments: (Never, int) [no-matching-overload]

mitmproxy (https://github.com/mitmproxy/mitmproxy)
- ERROR mitmproxy/flowfilter.py:266:40-66: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:268:41-68: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:280:40-66: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:291:41-68: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:307:34-43: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:313:34-43: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:317:67-81: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:324:44-69: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:326:45-71: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:343:34-43: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:347:59-73: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:354:44-69: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:370:34-43: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:374:63-77: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/flowfilter.py:381:45-71: No matching overload found for function `re.Pattern.search` called with arguments: (bytes) [no-matching-overload]
- ERROR mitmproxy/tls.py:107:16-19: Returned type `list[tuple[Unknown, Any | None]]` is not assignable to declared return type `list[tuple[int, bytes]]` [bad-return]

scipy (https://github.com/scipy/scipy)
- ERROR scipy/io/matlab/_mio5.py:519:5-22: Cannot set item in `ndarray[tuple[()], dtype[float64]]` [unsupported-operation]
- ERROR scipy/ndimage/_support_alternative_backends.py:116:9-21: Expected a callable, got `None` [not-callable]
- ERROR scipy/signal/_support_alternative_backends.py:401:13-25: Expected a callable, got `None` [not-callable]
+ ERROR subprojects/pyprima/pyprima/pyprima/src/pyprima/cobyla/trustregion.py:232:21-41: Cannot set item in `ndarray` [unsupported-operation]
+ ERROR subprojects/pyprima/pyprima/pyprima/src/pyprima/cobyla/trustregion.py:232:44-64: Cannot index into `ndarray` [bad-index]
+ ERROR subprojects/pyprima/pyprima/pyprima/src/pyprima/cobyla/trustregion.py:266:17-39: Cannot set item in `ndarray` [unsupported-operation]
+ ERROR subprojects/pyprima/pyprima/pyprima/src/pyprima/cobyla/trustregion.py:266:42-64: Cannot index into `ndarray` [bad-index]

colour (https://github.com/colour-science/colour)
- ERROR colour/io/luts/lut.py:1433:23-1438:18: No matching overload found for function `numpy.lib._arraypad_impl.pad` called with arguments: (Unknown, tuple[Literal[0], signedinteger[_16Bit] | signedinteger[_32Bit] | signedinteger[_64Bit] | signedinteger[_8Bit] | unsignedinteger[_16Bit] | unsignedinteger[_32Bit] | unsignedinteger[_64Bit] | unsignedinteger[_8Bit]], mode=Literal['constant'], constant_values=float) [no-matching-overload]
- ERROR colour/io/luts/lut.py:2097:12-29: `>` is not supported between `None` and `Literal[129]` [unsupported-operation]
- ERROR colour/io/luts/lut.py:2119:37-2121:14: No matching overload found for function `numpy._core.fromnumeric.reshape` called with arguments: (Any, tuple[Any | None, Any | None, Any | None, Literal[3]]) [no-matching-overload]
- ERROR colour/io/luts/lut.py:2135:37-2138:14: No matching overload found for function `numpy._core.fromnumeric.reshape` called with arguments: (Any, tuple[Any | None, Any | None, Any | None, Literal[3]]) [no-matching-overload]
+ ERROR colour/io/luts/tests/test_lut.py:638:13-642:14: Argument `list[_NestedSequence[bytes | complex | str] | bytes | complex | ndarray[tuple[Any, ...], dtype[float64]] | str]` is not assignable to parameter `a` with type `Buffer | _NestedSequence[bytes | complex | str] | _NestedSequence[_SupportsArray[dtype]] | _SupportsArray[dtype] | bytes | complex | str` in function `colour.utilities.array.tstack` [bad-argument-type]
+ ERROR colour/io/luts/tests/test_lut.py:650:21-654:22: Argument `list[_NestedSequence[bytes | complex | str] | bytes | complex | ndarray[tuple[Any, ...], dtype[float64]] | str]` is not assignable to parameter `a` with type `Buffer | _NestedSequence[bytes | complex | str] | _NestedSequence[_SupportsArray[dtype]] | _SupportsArray[dtype] | bytes | complex | str` in function `colour.utilities.array.tstack` [bad-argument-type]
+ ERROR colour/io/luts/tests/test_lut.py:1190:13-1194:14: Argument `list[_NestedSequence[bytes | complex | str] | bytes | complex | ndarray[tuple[Any, ...], dtype[float64]] | str]` is not assignable to parameter `a` with type `Buffer | _NestedSequence[bytes | complex | str] | _NestedSequence[_SupportsArray[dtype]] | _SupportsArray[dtype] | bytes | complex | str` in function `colour.utilities.array.tstack` [bad-argument-type]
- ERROR colour/notation/munsell.py:2573:13-43: Argument `tuple[Any, Any, Any]` is not assignable to parameter `value` with type `tuple[float, float]` in function `list.index` [bad-argument-type]
+ ERROR colour/notation/munsell.py:2573:13-43: Argument `tuple[Any, Unknown, Any]` is not assignable to parameter `value` with type `tuple[float, float]` in function `list.index` [bad-argument-type]
- ERROR colour/notation/munsell.py:2578:13-45: Argument `tuple[Any, Any, Any]` is not assignable to parameter `value` with type `tuple[float, float]` in function `list.index` [bad-argument-type]
+ ERROR colour/notation/munsell.py:2578:13-45: Argument `tuple[Any, Unknown, Any]` is not assignable to parameter `value` with type `tuple[float, float]` in function `list.index` [bad-argument-type]
+ ERROR colour/notation/tests/test_munsell.py:1734:32-80: Argument `list[_NestedSequence[bytes | complex | str] | bytes | complex | ndarray[tuple[Any, ...], dtype[float64 | floating[_16Bit] | floating[_32Bit]]] | str]` is not assignable to parameter `a` with type `Buffer | _NestedSequence[bytes | complex | str] | _NestedSequence[_SupportsArray[dtype]] | _SupportsArray[dtype] | bytes | complex | str` in function `colour.utilities.array.tstack` [bad-argument-type]
+ ERROR colour/notation/tests/test_munsell.py:1927:32-80: Argument `list[_NestedSequence[bytes | complex | str] | bytes | complex | ndarray[tuple[Any, ...], dtype[float64 | floating[_16Bit] | floating[_32Bit]]] | str]` is not assignable to parameter `a` with type `Buffer | _NestedSequence[bytes | complex | str] | _NestedSequence[_SupportsArray[dtype]] | _SupportsArray[dtype] | bytes | complex | str` in function `colour.utilities.array.tstack` [bad-argument-type]
+ ERROR colour/plotting/colorimetry.py:231:24-45: Argument `list[_NestedSequence[bytes | complex | str] | bytes | complex | ndarray[tuple[Any, ...], dtype[float64 | floating[_16Bit] | floating[_32Bit]]] | ndarray[tuple[Any, ...], dtype[Unknown]] | str]` is not assignable to parameter `a` with type `Buffer | _NestedSequence[bytes | complex | str] | _NestedSequence[_SupportsArray[dtype]] | _SupportsArray[dtype] | bytes | complex | str` in function `colour.utilities.array.tstack` [bad-argument-type]
- ERROR colour/plotting/diagrams.py:451:13-457:14: Argument `ndarray[tuple[int, int, int]]` is not assignable to parameter `segments` with type `Sequence[ArrayLike]` in function `matplotlib.collections.LineCollection.__init__` [bad-argument-type]
- ERROR colour/plotting/diagrams.py:465:13-56: Argument `ndarray[tuple[int, int, int]]` is not assignable to parameter `segments` with type `Sequence[ArrayLike]` in function `matplotlib.collections.LineCollection.__init__` [bad-argument-type]
- ERROR colour/plotting/models.py:419:13-425:14: Argument `ndarray[tuple[int, int, int]]` is not assignable to parameter `segments` with type `Sequence[ArrayLike]` in function `matplotlib.collections.LineCollection.__init__` [bad-argument-type]
- ERROR colour/plotting/temperature.py:233:9-238:10: Argument `ndarray[tuple[int, int, int]]` is not assignable to parameter `segments` with type `Sequence[ArrayLike]` in function `matplotlib.collections.LineCollection.__init__` [bad-argument-type]
- ERROR colour/plotting/temperature.py:479:13-484:14: Argument `ndarray[tuple[int, int, int]]` is not assignable to parameter `segments` with type `Sequence[ArrayLike]` in function `matplotlib.collections.LineCollection.__init__` [bad-argument-type]
- ERROR colour/plotting/temperature.py:500:17-506:18: Argument `ndarray[tuple[int, int, int], dtype[Unknown]]` is not assignable to parameter `segments` with type `Sequence[ArrayLike]` in function `matplotlib.collections.LineCollection.__init__` [bad-argument-type]
+ ERROR colour/plotting/tests/test_common.py:327:35-48: Object of class `Axes` has no attribute `get_zlim` [missing-attribute]
+ ERROR colour/plotting/volume.py:636:44-48: Argument `Axes` is not assignable to parameter `axes` with type `Axes3D | None` in function `nadir_grid` [bad-argument-type]
+ ERROR colour/plotting/volume.py:646:5-26: Object of class `Axes` has no attribute `add_collection3d` [missing-attribute]

poetry (https://github.com/python-poetry/poetry)
- ERROR src/poetry/console/commands/add.py:159:25-44: Object of class `NoneType` has no attribute `get` [missing-attribute]
- ERROR src/poetry/console/commands/add.py:170:17-50: `in` is not supported between `Literal['dependencies']` and `None` [not-iterable]
- ERROR src/poetry/console/commands/add.py:171:20-62: `in` is not supported between `Literal['optional-dependencies']` and `None` [not-iterable]
- ERROR src/poetry/console/commands/add.py:175:39-58: Object of class `NoneType` has no attribute `get` [missing-attribute]
- ERROR src/poetry/console/commands/add.py:179:39-58: Object of class `NoneType` has no attribute `get` [missing-attribute]
- ERROR src/poetry/console/commands/add.py:366:20-66: `not in` is not supported between `Literal['optional-dependencies']` and `None` [not-iterable]
- ERROR src/poetry/console/commands/add.py:367:21-61: Cannot set item in `None` [unsupported-operation]
- ERROR src/poetry/console/commands/add.py:368:36-76: `None` is not subscriptable [unsupported-operation]
- ERROR src/poetry/console/commands/add.py:369:21-61: `None` is not subscriptable [unsupported-operation]
- ERROR src/poetry/console/commands/add.py:370:18-55: `not in` is not supported between `Literal['dependencies']` and `None` [not-iterable]
- ERROR src/poetry/console/commands/add.py:371:17-48: Cannot set item in `None` [unsupported-operation]
- ERROR src/poetry/installation/executor.py:378:46-59: Object of class `NoneType` has no attribute `is_verbose` [missing-attribute]
- ERROR src/poetry/installation/executor.py:387:21-34: Object of class `NoneType` has no attribute `write_line` [missing-attribute]
- ERROR src/poetry/installation/executor.py:388:21-34: Object of class `NoneType` has no attribute `write_line` [missing-attribute]
- ERROR src/poetry/installation/executor.py:389:21-34: Object of class `NoneType` has no attribute `write_line` [missing-attribute]
- ERROR src/poetry/puzzle/solver.py:502:51-504:28: Object of class `NoneType` has no attribute `union` [missing-attribute]

paasta (https://github.com/yelp/paasta)
+ ERROR paasta_tools/contrib/rightsizer_soaconfigs_update.py:215:39-54: No matching overload found for function `min` called with arguments: (Literal[1], float) [no-matching-overload]
- ERROR paasta_tools/kubernetes_tools.py:1380:20-51: Object of class `NoneType` has no attribute `get` [missing-attribute]
- ERROR paasta_tools/kubernetes_tools.py:1381:23-54: Object of class `NoneType` has no attribute `get` [missing-attribute]
- ERROR paasta_tools/kubernetes_tools.py:1382:34-65: Object of class `NoneType` has no attribute `get` [missing-attribute]
- ERROR paasta_tools/kubernetes_tools.py:2393:13-39: Object of class `NoneType` has no attribute `pod_anti_affinity` [missing-attribute]

cibuildwheel (https://github.com/pypa/cibuildwheel)
- ERROR cibuildwheel/venv.py:29:19-43: `None` is not subscriptable [unsupported-operation]
- ERROR cibuildwheel/venv.py:30:15-35: `None` is not subscriptable [unsupported-operation]

streamlit (https://github.com/streamlit/streamlit)
- ERROR lib/streamlit/vendor/pympler/asizeof.py:1528:17-20: No matching overload found for function `iter` called with arguments: (object | tuple[()] | Any) [no-matching-overload]
+ ERROR lib/streamlit/vendor/pympler/asizeof.py:1528:17-20: No matching overload found for function `iter` called with arguments: (object | tuple[()] | Unknown) [no-matching-overload]
- ERROR lib/streamlit/vendor/pympler/asizeof.py:2107:22-41: Argument `object | tuple[()] | Any` is not assignable to parameter `iterable` with type `Iterable[@_]` in function `tuple.__new__` [bad-argument-type]
+ ERROR lib/streamlit/vendor/pympler/asizeof.py:2107:22-41: Argument `object | tuple[()] | Unknown` is not assignable to parameter `iterable` with type `Iterable[@_]` in function `tuple.__new__` [bad-argument-type]
- ERROR lib/streamlit/vendor/pympler/asizeof.py:2310:20-43: No matching overload found for function `sum` called with arguments: (object | tuple[()] | Any) [no-matching-overload]
+ ERROR lib/streamlit/vendor/pympler/asizeof.py:2310:20-43: No matching overload found for function `sum` called with arguments: (object | tuple[()] | Unknown) [no-matching-overload]
- ERROR lib/tests/streamlit/web/server/server_test_case.py:89:13-47: Argument `Literal[b'']` is not assignable to parameter `url` with type `HTTPRequest | str` in function `tornado.websocket.websocket_connect` [bad-argument-type]

hydra-zen (https://github.com/mit-ll-responsible-ai/hydra-zen)
-  INFO tests/annotations/declarations.py:995:16-56: revealed type: Any [reveal-type]
+  INFO tests/annotations/declarations.py:995:16-56: revealed type: Unknown [reveal-type]

archinstall (https://github.com/archlinux/archinstall)
- ERROR archinstall/lib/models/application.py:142:55-59: Argument `Any | None` is not assignable to parameter `value` with type `str` in function `enum.StrEnum.__new__` [bad-argument-type]

scikit-learn (https://github.com/scikit-learn/scikit-learn)
- ERROR sklearn/_loss/loss.py:1033:9-24: Class member `HalfMultinomialLoss.in_y_true_range` overrides parent class `BaseLoss` in an inconsistent manner [bad-override]
- ERROR sklearn/cluster/_affinity_propagation.py:533:9-537:10: Cannot unpack tuple[list[Unknown] | ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]] | ndarray[tuple[Any, ...], dtype[Unknown]]] | tuple[list[Unknown] | ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[Any, ...], dtype[signedinteger[_NBitIntP]]] | ndarray[tuple[Any, ...], dtype[Unknown]], int] | tuple[ndarray, ndarray] | tuple[ndarray, ndarray, Literal[0]] | tuple[Unknown, Unknown] | tuple[Unknown, Unknown, Literal[0]] (of size 2) into 3 values [bad-unpacking]
+ ERROR sklearn/cluster/_affinity_propagation.py:533:9-537:10: Cannot unpack tuple[list[Unknown] | ndarray[tuple[Any, ...], dtype[Unknown]], ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown] | tuple[list[Unknown] | ndarray[tuple[Any, ...], dtype[Unknown]], ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown, int] | tuple[ndarray, ndarray] | tuple[ndarray, ndarray, Literal[0]] | tuple[Unknown, Unknown] | tuple[Unknown, Unknown, Literal[0]] (of size 2) into 3 values [bad-unpacking]
- ERROR sklearn/cluster/_agglomerative.py:685:13-26: Object of class `ndarray` has no attribute `append` [missing-attribute]
- ERROR sklearn/cluster/_agglomerative.py:1084:32-47: `int | Unknown` is not assignable to attribute `n_clusters_` with type `signedinteger[_NBitIntP]` [bad-assignment]
- ERROR sklearn/cluster/_bicluster.py:613:48-85: No matching overload found for function `numpy.lib._shape_base_impl.apply_along_axis` called with arguments: ((v: Unknown) -> ndarray[tuple[int], dtype[float64]] | Unknown, axis=Literal[1], arr=Unknown) [no-matching-overload]
+ ERROR sklearn/cluster/_bicluster.py:613:48-85: No matching overload found for function `numpy.lib._shape_base_impl.apply_along_axis` called with arguments: ((v: Unknown) -> Unknown, axis=Literal[1], arr=Unknown) [no-matching-overload]
- ERROR sklearn/cluster/tests/test_hierarchical.py:88:9-49: Cannot unpack tuple[ndarray | Unknown, int, int | Any, None] | tuple[ndarray | Unknown, int, int | Any, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[int], dtype[float64]] | ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], Unknown] | Unknown (of size 5) into 4 values [bad-unpacking]
+ ERROR sklearn/cluster/tests/test_hierarchical.py:88:9-49: Cannot unpack tuple[ndarray | Unknown, int, Unknown, None] | tuple[ndarray | Unknown, int, Unknown, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown, Unknown] | Unknown (of size 5) into 4 values [bad-unpacking]
- ERROR sklearn/cluster/tests/test_hierarchical.py:119:21-56: Cannot unpack tuple[ndarray | Unknown, int, int | Any, None] | tuple[ndarray | Unknown, int, int | Any, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[int], dtype[float64]] | ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], Unknown] | Unknown (of size 5) into 4 values [bad-unpacking]
- ERROR sklearn/cluster/tests/test_hierarchical.py:133:9-44: Cannot unpack tuple[ndarray | Unknown, int, int | Any, None] | tuple[ndarray | Unknown, int, int | Any, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[int], dtype[float64]] | ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], Unknown] | Unknown (of size 5) into 4 values [bad-unpacking]
+ ERROR sklearn/cluster/tests/test_hierarchical.py:119:21-56: Cannot unpack tuple[ndarray | Unknown, int, Unknown, None] | tuple[ndarray | Unknown, int, Unknown, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown, Unknown] | Unknown (of size 5) into 4 values [bad-unpacking]
+ ERROR sklearn/cluster/tests/test_hierarchical.py:133:9-44: Cannot unpack tuple[ndarray | Unknown, int, Unknown, None] | tuple[ndarray | Unknown, int, Unknown, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown, Unknown] | Unknown (of size 5) into 4 values [bad-unpacking]
- ERROR sklearn/cluster/tests/test_hierarchical.py:352:13-37: Cannot unpack tuple[ndarray | Unknown, int, int | Any, None] | tuple[ndarray | Unknown, int, int | Any, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[int], dtype[float64]] | ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], Unknown] | Unknown (of size 5) into 4 values [bad-unpacking]
+ ERROR sklearn/cluster/tests/test_hierarchical.py:352:13-37: Cannot unpack tuple[ndarray | Unknown, int, Unknown, None] | tuple[ndarray | Unknown, int, Unknown, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown, Unknown] | Unknown (of size 5) into 4 values [bad-unpacking]
- ERROR sklearn/cluster/tests/test_hierarchical.py:385:5-29: Cannot unpack tuple[ndarray | Unknown, int, int | Any, None] | tuple[ndarray | Unknown, int, int | Any, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[int], dtype[float64]] | ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], Unknown] (of size 5) into 4 values [bad-unpacking]
+ ERROR sklearn/cluster/tests/test_hierarchical.py:385:5-29: Cannot unpack tuple[ndarray | Unknown, int, Unknown, None] | tuple[ndarray | Unknown, int, Unknown, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown, Unknown] (of size 5) into 4 values [bad-unpacking]
- ERROR sklearn/cluster/tests/test_hierarchical.py:751:9-60: Cannot unpack tuple[ndarray | Unknown, int, int | Any, None] | tuple[ndarray | Unknown, int, int | Any, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, int | Any, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], ndarray[tuple[int], dtype[float64]] | ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]] | tuple[Unknown, Unknown, Unknown, ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]], Unknown] | Unknown (of size 4) into 5 values [bad-unpacking]
+ ERROR sklearn/cluster/tests/test_hierarchical.py:751:9-60: Cannot unpack tuple[ndarray | Unknown, int, Unknown, None] | tuple[ndarray | Unknown, int, Unknown, None, ndarray[tuple[Any, ...], dtype[float64]] | Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown] | tuple[ndarray[tuple[Any, ...], dtype[Unknown]], int, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown] | tuple[Unknown, Unknown, Unknown, Unknown, Unknown] | Unknown (of size 4) into 5 values [bad-unpacking]
-  WARN sklearn/datasets/_base.py:1007:5-17: `numpy.ndarray.shape` is deprecated [deprecated]
-  WARN sklearn/datasets/_lfw.py:479:5-16: `numpy.ndarray.shape` is deprecated [deprecated]
- ], tuple[Unknown, ndarray[tuple[Any, ...], dtype[float64]], ndarray[tuple[int, int], dtype[float64]]]) [no-matching-overload]
+ ], tuple[Unknown, ndarray[tuple[Any, ...], dtype[float64]], Unknown]) [no-matching-overload]
- ERROR sklearn/decomposition/_lda.py:526:32-67: `float | ndarray[tuple[Any, ...], dtype[float64]] | Unknown` is not assignable to attribute `components_` with type `ndarray[tuple[Any, ...], dtype[float64]]` [bad-assignment]
- ERROR sklearn/decomposition/_nmf.py:1223:28-81: No matching overload found for function `numpy._core.numeric.full` called with arguments: (tuple[Unknown, str | Unknown], Any, dtype=Unknown) [no-matching-overload]
+ ERROR sklearn/decomposition/_nmf.py:1223:28-81: No matching overload found for function `numpy._core.numeric.full` called with arguments: (tuple[Unknown, str | Unknown], Unknown, dtype=Unknown) [no-matching-overload]
- ERROR sklearn/decomposition/_nmf.py:2045:20-74: No matching overload found for function `numpy._core.numeric.full` called with arguments: (tuple[Unknown, str | Unknown], Any, dtype=Unknown) [no-matching-overload]
+ ERROR sklearn/decomposition/_nmf.py:2045:20-74: No matching overload found for function `numpy._core.numeric.full` called with arguments: (tuple[Unknown, str | Unknown], Unknown, dtype=Unknown) [no-matching-overload]
- ERROR sklearn/dummy.py:308:28-314:18: No matching overload found for function `numpy.lib._shape_base_impl.tile` called with arguments: (list[ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown], list[Integral | int | Unknown]) [no-matching-overload]
+ ERROR sklearn/dummy.py:308:28-314:18: No matching overload found for function `numpy.lib._shape_base_impl.tile` called with arguments: (list[Unknown], list[Integral | int | Unknown]) [no-matching-overload]
- ERROR sklearn/ensemble/_gb.py:2192:9-14: Class member `GradientBoostingRegressor.apply` overrides parent class `BaseGradientBoosting` in an inconsistent manner [bad-override]
- ERROR sklearn/ensemble/_hist_gradient_boosting/tests/test_histogram.py:238:25-39: Cannot index into `ndarray[tuple[int, int], dtype[float64]]` [bad-index]
- ERROR sklearn/ensemble/_hist_gradient_boosting/tests/test_histogram.py:238:41-59: Cannot index into `ndarray[tuple[Any, ...], dtype[float64]]` [bad-index]
- ERROR sklearn/ensemble/_hist_gradient_boosting/tests/test_histogram.py:239:25-40: Cannot index into `ndarray[tuple[int, int], dtype[float64]]` [bad-index]
- ERROR sklearn/ensemble/_hist_gradient_boosting/tests/test_histogram.py:239:42-61: Cannot index into `ndarray[tuple[Any, ...], dtype[float64]]` [bad-index]
- ERROR sklearn/ensemble/_weight_boosting.py:669:13-23: Cannot index into `float` [bad-index]
- ERROR sklearn/ensemble/_weight_boosting.py:669:13-23: Cannot set item in `float` [unsupported-operation]
- ERROR sklearn/ensemble/_weight_boosting.py:670:20-28: Object of class `float` has no attribute `sum` [missing-attribute]
- ERROR sklearn/ensemble/tests/test_weight_boosting.py:70:12-42: Object of class `float` has no attribute `shape` [missing-attribute]
- ERROR sklearn/ensemble/tests/test_weight_boosting.py:91:12-50: Object of class `float` has no attribute `shape` [missing-attribute]
- ERROR sklearn/feature_selection/tests/test_base.py:70:28-45: Object of class `list` has no attribute `toarray`
+ ERROR sklearn/feature_selection/tests/test_base.py:70:28-45: Object of class `list` has no attribute `toarray` [missing-attribute]
- Object of class `ndarray` has no attribute `toarray` [missing-attribute]
+ ERROR sklearn/feature_selection/tests/test_feature_select.py:398:12-25: Object of class `SparseABC` has no attribute `shape` [missing-attribute]
+ ERROR sklearn/feature_selection/tests/test_feature_select.py:399:24-48: Cannot index into `SparseABC` [bad-index]
- ERROR sklearn/feature_selection/tests/test_base.py:106:30-49: Object of class `ndarray` has no attribute `toarray` [missing-attribute]
+ ERROR sklearn/feature_selection/tests/test_feature_select.py:401:12-23: Object of class `SparseABC` has no attribute `nnz` [missing-attribute]
- ERROR sklearn/feature_selection/tests/test_rfe.py:117:36-54: Object of class `list` has no attribute `toarray`
+ ERROR sklearn/feature_selection/tests/test_rfe.py:117:36-54: Object of class `list` has no attribute `toarray` [missing-attribute]
- Object of class `ndarray` has no attribute `toarray` [missing-attribute]
- ERROR sklearn/feature_selection/tests/test_rfe.py:211:24-42: Object of class `list` has no attribute `toarray`
+ ERROR sklearn/feature_selection/tests/test_rfe.py:211:24-42: Object of class `list` has no attribute `toarray` [missing-attribute]
- Object of class `ndarray` has no attribute `toarray` [missing-attribute]
- ERROR sklearn/feature_selection/tests/test_rfe.py:255:24-42: Object of class `list` has no attribute `toarray`
+ ERROR sklearn/feature_selection/tests/test_rfe.py:255:24-42: Object of class `list` has no attribute `toarray` [missing-attribute]
- Object of class `ndarray` has no attribute `toarray` [missing-attribute]
- ERROR sklearn/feature_selection/tests/test_rfe.py:262:24-42: Object of class `list` has no attribute `toarray`
+ ERROR sklearn/feature_selection/tests/test_rfe.py:262:24-42: Object of class `list` has no attribute `toarray` [missing-attribute]
- Object of class `ndarray` has no attribute `toarray` [missing-attribute]
- ERROR sklearn/gaussian_process/_gpc.py:288:31-55: `-` is not supported between `None` and `ndarray[tuple[Any, ...], dtype[float64]]` [unsupported-operation]
- ERROR sklearn/gaussian_process/_gpc.py:403:23-41: `-` is not supported between `None` and `ndarray[tuple[Any, ...], dtype[float64]]` [unsupported-operation]
- ERROR sklearn/gaussian_process/_gpc.py:436:36-60: `-` is not supported between `None` and `ndarray[tuple[Any, ...], dtype[float64]]` [unsupported-operation]
- ERROR sklearn/gaussian_process/_gpc.py:475:26-44: `-` is not supported between `None` and `ndarray[tuple[Any, ...], dtype[float64]]` [unsupported-operation]
- ERROR sklearn/gaussian_process/_gpr.py:530:57-62: Argument `ndarray[tuple[Any, ...], dtype[complex128 | Any]] | ndarray | ndarray[tuple[Any, ...], dtype[Unknown]] | tuple[Unknown, ndarray[tuple[Any, ...], dtype[Unknown]]] | Unknown` is not assignable to parameter `cov` with type `_NestedSequence[_SupportsArray[dtype[numpy.bool | floating | integer]]] | _NestedSequence[float] | _SupportsArray[dtype[numpy.bool | floating | integer]] | float` in function `numpy.random.mtrand.RandomState.multivariate_normal` [bad-argument-type]
+ ERROR sklearn/gaussian_process/_gpr.py:530:57-62: Argument `ndarray[tuple[Any, ...], dtype[complex128 | Any]] | ndarray[tuple[Any, ...], dtype[Unknown]] | tuple[Unknown, ndarray[tuple[Any, ...], dtype[Unknown]]] | Unknown` is not assignable to parameter `cov` with type `_NestedSequence[_SupportsArray[dtype[numpy.bool | floating | integer]]] | _NestedSequence[float] | _SupportsArray[dtype[numpy.bool | floating | integer]] | float` in function `numpy.random.mtrand.RandomState.multivariate_normal` [bad-argument-type]
- ERROR sklearn/gaussian_process/_gpr.py:534:40-58: Argument `ndarray[tuple[Any, ...], dtype[complex128 | Any]] | ndarray | ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown` is not assignable to parameter `cov` with type `_NestedSequence[_SupportsArray[dtype[numpy.bool | floating | integer]]] | _NestedSequence[float] | _SupportsArray[dtype[numpy.bool | floating | integer]] | float` in function `numpy.random.mtrand.RandomState.multivariate_normal` [bad-argument-type]
+ ERROR sklearn/gaussian_process/_gpr.py:534:40-58: Argument `ndarray[tuple[Any, ...], dtype[complex128 | Any]] | ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown` is not assignable to parameter `cov` with type `_NestedSequence[_SupportsArray[dtype[numpy.bool | floating | integer]]] | _NestedSequence[float] | _SupportsArray[dtype[numpy.bool | floating | integer]] | float` in function `numpy.random.mtrand.RandomState.multivariate_normal` [bad-argument-type]
- ERROR sklearn/gaussian_process/kernels.py:141:9-15: Class member `Hyperparameter.__eq__` overrides parent class `Hyperparameter` in an inconsistent manner [bad-override]
- ERROR sklearn/gaussian_process/kernels.py:639:9-15: Class member `CompoundKernel.__eq__` overrides parent class `Kernel` in an inconsistent manner [bad-override]
- ERROR sklearn/gaussian_process/kernels.py:646:9-22: Class member `CompoundKernel.is_stationary` overrides parent class `Kernel` in an inconsistent manner [bad-override]
- ERROR sklearn/gaussian_process/kernels.py:651:9-30: Class member `CompoundKernel.requires_vector_input` overrides parent class `Kernel` in an inconsistent manner [bad-override]
- ERROR sklearn/gaussian_process/kernels.py:1292:9-13: Class member `ConstantKernel.diag` overrides parent class `Kernel` in an inconsistent manner [bad-override]
- ERROR sklearn/gaussian_process/kernels.py:1415:9-13: Class member `WhiteKernel.diag` overrides parent class `Kernel` in an inconsistent manner [bad-override]
- ERROR sklearn/gaussian_process/tests/test_gpr.py:294:18-36: `*` is not supported between `tuple[Unknown, ndarray[tuple[Any, ...], dtype[Unknown]]]` and `floating` [unsupported-operation]
- ERROR sklearn/gaussian_process/tests/test_gpr.py:301:13-29: `*` is not supported between `tuple[Unknown, ndarray[tuple[Any, ...], dtype[Unknown]]]` and `floating` [unsupported-operation]
- ERROR sklearn/gaussian_process/tests/test_kernels.py:195:10-28: `*` is not supported between `float` and `tuple[Unknown, ndarray[tuple[int, int, int], dtype[float64]]]` [unsupported-operation]
- ERROR sklearn/gaussian_process/tests/test_kernels.py:200:10-28: `*` is not supported between `float` and `tuple[Unknown, ndarray[tuple[int, int, int], dtype[float64]]]` [unsupported-operation]
- ERROR sklearn/gaussian_process/tests/test_kernels.py:308:34-39: Argument `tuple[Unknown, ndarray[tuple[int, int, int], dtype[float64]]] | tuple[Unknown, Unknown] | Unknown | None` is not assignable to parameter `desired` with type `_NestedSequence[_SupportsArray[dtype[numpy.bool | number]]] | _NestedSequence[_SupportsArray[dtype[object_]]] | _NestedSequence[complex] | _SupportsArray[dtype[numpy.bool | number]] | _SupportsArray[dtype[object_]] | complex` in function `numpy.testing._private.utils.assert_array_almost_equal` [bad-argument-type]
- ERROR sklearn/gaussian_process/tests/test_kernels.py:322:35-37: Argument `tuple[Unknown, ndarray[tuple[int, int, int], dtype[float64]]] | tuple[Unknown, Unknown] | Unknown | None` is not assignable to parameter `desired` with type `_NestedSequence[_SupportsArray[dtype[numpy.bool | number]]] | _NestedSequence[_SupportsArray[dtype[object_]]] | _NestedSequence[complex] | _SupportsArray[dtype[numpy.bool | number]] | _SupportsArray[dtype[object_]] | complex` in function `numpy.testing._private.utils.assert_array_almost_equal` [bad-argument-type]
+ ERROR sklearn/gaussian_process/tests/test_kernels.py:308:34-39: Argument `tuple[Unknown, Unknown] | Unknown | None` is not assignable to parameter `desired` with type `_NestedSequence[_SupportsArray[dtype[numpy.bool | number]]] | _NestedSequence[_SupportsArray[dtype[object_]]] | _NestedSequence[complex] | _SupportsArray[dtype[numpy.bool | number]] | _SupportsArray[dtype[object_]] | complex` in function `numpy.testing._private.utils.assert_array_almost_equal` [bad-argument-type]
+ ERROR sklearn/gaussian_process/tests/test_kernels.py:322:35-37: Argument `tuple[Unknown, Unknown] | Unknown | None` is not assignable to parameter `desired` with type `_NestedSequence[_SupportsArray[dtype[numpy.bool | number]]] | _NestedSequence[_SupportsArray[dtype[object_]]] | _NestedSequence[complex] | _SupportsArray[dtype[numpy.bool | number]] | _SupportsArray[dtype[object_]] | complex` in function `numpy.testing._private.utils.assert_array_almost_equal` [bad-argument-type]
- ERROR sklearn/impute/_base.py:663:27-665:24: Cannot index into `ndarray[tuple[int], dtype[signedinteger[_NBitIntP]]]` [bad-index]
- ERROR sklearn/impute/_knn.py:211:29-68: No matching overload found for function `numpy.lib._function_base_impl.average` called with arguments: (MaskedArray[tuple[Any, ...], dtype[Unknown]], axis=Literal[1], weights=object | Unknown) [no-matching-overload]
+ ERROR sklearn/impute/_knn.py:211:29-68: No matching overload found for function `numpy.lib._function_base_impl.average` called with arguments: (Unknown, axis=Literal[1], weights=object | Unknown) [no-matching-overload]
- ERROR sklearn/linear_model/_glm/_newton_solver.py:185:42-198:10: No matching overload found for function `scipy.optimize._minimize.minimize` called with arguments: ((self: LinearModelLoss, coef: Unknown, X: Unknown, y: Unknown, sample_weight: Unknown | None = None, l2_reg_strength: float | Unknown = ..., n_threads: int | Unknown = 1, raw_prediction: Unknown | None = None) -> tuple[float, ndarray[tuple[int], dtype[float64]] | ndarray[tuple[int, int], dtype[float64]] | Unknown] | Unknown, Unknown, method=Literal['L-BFGS-B'], jac=Literal[True], options=dict[str | Unknown, float | float64 | int | Unknown], args=tuple[Unknown, Unknown, Unknown, float | Unknown, int | Unknown]) [no-matching-overload]
+ ERROR sklearn/linear_model/_glm/_newton_solver.py:185:42-198:10: No matching overload found for function `scipy.optimize._minimize.minimize` called with arguments: ((self: LinearModelLoss, coef: Unknown, X: Unknown, y: Unknown, sample_weight: Unknown | None = None, l2_reg_strength: float | Unknown = ..., n_threads: int | Unknown = 1, raw_prediction: Unknown | None = None) -> tuple[float, Unknown] | Unknown, Unknown, method=Literal['L-BFGS-B'], jac=Literal[True], options=dict[str | Unknown, float | float64 | int | Unknown], args=tuple[Unknown, Unknown, Unknown, float | Unknown, int | Unknown]) [no-matching-overload]
- ERROR sklearn/linear_model/_glm/glm.py:265:46-281:14: No matching overload found for function `scipy.optimize._minimize.minimize` called with arguments: ((self: LinearModelLoss, coef: Unknown, X: Unknown, y: Unknown, sample_weight: Unknown | None = None, l2_reg_strength: float | Unknown = ..., n_threads: int | Unknown = 1, raw_prediction: Unknown | None = None) -> tuple[float, ndarray[tuple[int], dtype[float64]] | ndarray[tuple[int, int], dtype[float64]] | Unknown], ndarray[tuple[Any, ...], dtype[Unknown]] | Unknown, method=Literal['L-BFGS-B'], jac=Literal[True], options=dict[str | Unknown, float | float64 | int | Unknown], args=tuple[str | Unknown, Unknown, Unknown | None, float | Unknown, Unknown]) [no-matching-overload]

... (truncated 129 lines) ...

optuna (https://github.com/optuna/optuna)
- ERROR optuna/_gp/search_space.py:109:42-113:22: No matching overload found for function `numpy._core.multiarray.arange` called with arguments: (Any, Unknown, ndarray) [no-matching-overload]
- ERROR optuna/_gp/search_space.py:114:43-63: Argument `ndarray[tuple[Any, ...], dtype[signedinteger[_64Bit]]]` is not assignable to parameter `value` with type `int` in function `enum.IntEnum.__new__` [bad-argument-type]
- ERROR optuna/_gp/search_space.py:116:26-40: Argument `ndarray` is not assignable to parameter `step` with type `float` in function `_normalize_one_param` [bad-argument-type]
- ERROR optuna/_hypervolume/hssp.py:95:26-52: No matching overload found for function `max` called with arguments: (ndarray, float | Unknown) [no-matching-overload]
- ERROR optuna/_transform.py:160:55-75: Argument `signedinteger[_NBitIntP] | Unknown` is not assignable to parameter `param_value_in_internal_repr` with type `float` in function `optuna.distributions.CategoricalDistribution.to_external_repr` [bad-argument-type]
- ERROR optuna/testing/storages.py:122:44-124:14: `Any | None` is not assignable to attribute `_redis` with type `Redis[bytes]` [bad-assignment]

... (truncated 243 lines) ...
```

@github-actions

Primer Diff Classification

❌ 16 regression(s) | ✅ 37 improvement(s) | ➖ 3 neutral | 56 project(s) total

16 regression(s) across kornia, scrapy, pandas, pip, tornado, scipy-stubs, materialize, core, trio, scipy, colour, streamlit, scikit-learn, spark, aiortc, rich. Error kinds: no-matching-overload, bad-argument-type, unsupported-operation. Caused by call_overload(), is_consistent(), is_equivalent(), overload_resolution(), try_call_overload(). 37 improvement(s) across spack, parso, dd-trace-py, pydantic, freqtrade, ibis, aioredis, comtypes, starlette, zulip, hydpy, egglog-python, porcupine, cwltool, aiohttp, mkosi, mitmproxy, poetry, paasta, cibuildwheel, archinstall, optuna, bandersnatch, kopf, static-frame, psycopg, schemathesis, steam.py, pytest, openlibrary, black, cloud-init, jax, sphinx, xarray, rotki, discord.py.

| Project | Verdict | Changes | Error Kinds | Root Cause |
| --- | --- | --- | --- | --- |
| spack | ✅ Improvement | -18 | bad-argument-type, bad-assignment | call_overload() |
| parso | ✅ Improvement | +1, -3 | bad-index, missing-attribute | is_equivalent() |
| kornia | ❌ Regression | +1 | no-matching-overload | is_consistent() |
| beartype | ➖ Neutral | +1, -1 | bad-argument-type | |
| dd-trace-py | ✅ Improvement | -11 | bad-argument-type, bad-return | overload_resolution() |
| pydantic | ✅ Improvement | -1 | unsupported-operation | is_equivalent() |
| freqtrade | ✅ Improvement | -8 | bad-return, bad-typed-dict-key | overload_resolution() |
| ibis | ✅ Improvement | +1, -2 | bad-return, not-callable | is_consistent() |
| aioredis | ✅ Improvement | -1 | unsupported-operation | call_overload() |
| scrapy | ❌ Regression | +1, -5 | bad-argument-type, no-matching-overload | is_consistent() |
| comtypes | ✅ Improvement | -1 | missing-attribute | is_consistent() |
| starlette | ✅ Improvement | -2 | missing-attribute, unsupported-operation | is_consistent() |
| zulip | ✅ Improvement | -3 | bad-argument-type | overload_resolution() |
| hydpy | ✅ Improvement | -1 | bad-argument-type | is_consistent() |
| pandas | ❌ Regression | +236, -494 | bad-argument-type, bad-assignment | call_overload() |
| pip | ❌ Regression | +1, -2 | bad-param-name-override, bad-return | is_equivalent() |
| tornado | ❌ Regression | +2 | no-matching-overload | is_equivalent() |
| egglog-python | ✅ Improvement | -6 | missing-attribute | is_consistent() |
| porcupine | ✅ Improvement | -1 | bad-return | try_call_overload() |
| cwltool | ✅ Improvement | -2 | bad-assignment, unsupported-operation | is_equivalent() |
| scipy-stubs | ❌ Regression | +65 | assert-type | is_consistent() |
| materialize | ❌ Regression | -2 | bad-argument-type, unsupported-operation | is_equivalent() |
| aiohttp | ✅ Improvement | -1 | bad-argument-type | is_equivalent() |
| core | ❌ Regression | +13, -35 | bad-argument-type, bad-assignment | is_equivalent() |
| mkosi | ✅ Improvement | -2 | bad-argument-type | pyrefly/lib/alt/overload.rs |
| trio | ❌ Regression | +1, -1 | no-matching-overload | try_call_overload() |
| mitmproxy | ✅ Improvement | -16 | bad-return, no-matching-overload | is_consistent() |
| scipy | ❌ Regression | +4, -3 | bad-index, not-callable | call_overload() |
| colour | ❌ Regression | +11, -12 | bad-argument-type, missing-attribute | call_overload() |
| poetry | ✅ Improvement | -16 | missing-attribute, not-iterable | is_equivalent() |
| paasta | ✅ Improvement | +1, -4 | missing-attribute, no-matching-overload | call_overload() |
| cibuildwheel | ✅ Improvement | -2 | unsupported-operation | is_consistent() |
| streamlit | ❌ Regression | +3, -4 | no-matching-overload | is_consistent() |
| hydra-zen | ➖ Neutral | +1, -1 | reveal-type | |
| archinstall | ✅ Improvement | -1 | bad-argument-type | is_equivalent() |
| scikit-learn | ❌ Regression | +84, -115 | no-matching-overload on unknown | overload_resolution() |
| optuna | ✅ Improvement | -7 | bad-argument-type, bad-assignment | is_consistent() |
| spark | ❌ Regression | +1, -5 | new-false-positive | is_consistent() |
| bandersnatch | ✅ Improvement | -1 | bad-argument-type | is_consistent() |
| kopf | ✅ Improvement | -4 | bad-argument-type, no-matching-overload | is_equivalent() |
| static-frame | ✅ Improvement | -14 | invalid-yield, missing-attribute | overload_resolution() |
| psycopg | ✅ Improvement | -1 | bad-argument-type | is_consistent() |
| schemathesis | ✅ Improvement | -1 | bad-return | is_equivalent() |
| steam.py | ✅ Improvement | -1 | bad-argument-type | is_consistent() |
| pytest | ✅ Improvement | -1 | no-matching-overload | is_equivalent() |
| openlibrary | ✅ Improvement | -46 | bad-argument-type, bad-index | is_equivalent() |
| black | ✅ Improvement | -1 | bad-argument-type | is_consistent() |
| meson | ➖ Neutral | +8, -10 | bad-argument-type, bad-return | |
| aiortc | ❌ Regression | +2 | no-matching-overload | is_equivalent() |
| rich | ❌ Regression | +1 | no-matching-overload | is_equivalent() |
| cloud-init | ✅ Improvement | +1, -19 | bad-argument-type, bad-override | is_equivalent() |
| jax | ✅ Improvement | -11 | bad-argument-type, missing-attribute | is_consistent() |
| sphinx | ✅ Improvement | -1 | unsupported-operation | is_equivalent() |
| xarray | ✅ Improvement | +6, -39 | bad-argument-type, bad-index | overload_resolution() |
| rotki | ✅ Improvement | -2 | bad-argument-type, no-matching-overload | pyrefly/lib/alt/overload.rs |
| discord.py | ✅ Improvement | +1, -11 | bad-argument-type, bad-assignment | is_equivalent() |
Detailed analysis

❌ Regression (16)

kornia (+1)

This is a regression. The code max(0, self.p - self.adjustment_speed) where self.p and self.adjustment_speed are floats is perfectly valid Python - integers and floats are comparable and max() should handle this. The PR made overload resolution overly strict by changing from is_consistent to is_equivalent checks, causing it to reject a common and correct Python pattern that mypy/pyright would accept. The test case in the PR diff even shows this exact pattern being flagged as an error, confirming the stricter behavior is intentional but too restrictive.
Attribution: The change from is_consistent() to is_equivalent() in pyrefly/lib/alt/overload.rs made overload resolution stricter, causing it to reject the max(Literal[0], float) call that was previously accepted.
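The flagged pattern reduces to a tiny, runnable sketch (the names below are hypothetical stand-ins for kornia's actual attributes):

```python
# Hypothetical stand-ins for kornia's self.p and self.adjustment_speed.
p: float = 0.5
adjustment_speed: float = 0.25

# max() over an int literal and a float is valid at runtime: Python's
# numeric types are mutually comparable, and the result is whichever
# value is larger.
clamped = max(0, p - adjustment_speed)

assert clamped == 0.25
```

Since this pattern works at runtime and is accepted by mypy and pyright, any new error on it is a false positive.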

scrapy (+1, -5)

This appears to be a regression. The new error flags max(0, float_value) which is a common, valid pattern that mypy/pyright handle correctly. The PR's own test case shows this should work. The removed errors involved 'Unknown' and 'Any' types suggesting they were inference failures rather than real bugs. The stricter overload resolution is creating false positives while removing what were likely already false positives.
Attribution: The change to overload resolution logic in pyrefly/lib/alt/overload.rs, specifically the switch from is_consistent() to is_equivalent() for return type checking and the bidirectional variadic parameter elimination, caused these changes. The new stricter equivalence check is likely causing max() overloads to be rejected incorrectly.

pandas (+236, -494)

This represents a regression. While the PR aims for spec compliance, it has made overload resolution overly strict, causing widespread inference failures in pandas. The removal of contextual hints during overload selection breaks legitimate type inference patterns that pandas relies on. The new errors show Unknown types proliferating through the codebase, indicating the type checker is failing to resolve types that were previously inferred correctly. The removed errors were mostly false positives about missing attributes that actually exist. The net effect is a significant degradation in type checking quality for pandas code.
Attribution: The changes to call_overload() in pyrefly/lib/alt/overload.rs that removed hint usage during initial overload selection (line 488: None, // don't use the hint yet) and modified variadic parameter elimination logic caused these inference failures.

pip (+1, -2)

The new error for max(0, x) is a regression - pyrefly is incorrectly rejecting valid Python code that works at runtime and is accepted by other type checkers. The max function should handle Literal[0] and float arguments without issue. The removed errors were false positives that pyrefly correctly eliminated - the bad-return error was an inference failure, and the bad-param-name-override was incorrectly flagging valid metaclass method overrides in Pygments.
Attribution: The change to overload resolution in pyrefly/lib/alt/overload.rs made overload selection stricter. The modification to is_equivalent() instead of is_consistent() and changes to how variadic parameters are handled caused the new max(0, x) error, while fixing the overload resolution logic removed the false positive errors.

tornado (+2)

These are false positive errors. The code max(0, x) where x is float is perfectly valid Python that works at runtime. The max function accepts mixed numeric types due to Python's numeric tower - int and float are compatible. The PR change from is_consistent() to is_equivalent() in overload resolution appears to have made pyrefly overly strict about return type matching, causing it to incorrectly reject valid numeric type combinations that mypy/pyright would accept. This represents a regression where pyrefly is now flagging correct code as erroneous.
Attribution: The change to is_equivalent() from is_consistent() in pyrefly/lib/alt/overload.rs line 590 appears to be the cause. The PR description mentions 'properly check for equivalent return types when determining whether overload resolution is ambiguous', suggesting this change made overload resolution stricter about return type equivalence. This likely caused previously-matching overloads for max(Literal[0], float) to be rejected as ambiguous.

scipy-stubs (+65)

This is a regression. The change from is_consistent to is_equivalent in overload resolution is too strict, causing pyrefly to fail to resolve overloads that mypy/pyright handle correctly. In scipy-stubs, mathematical functions like clarkson_woodruff_transform(arr_any, 2) should return onp.ArrayND[Any] but pyrefly now infers Unknown, indicating overload resolution failure. Since this is a well-tested stubs project that works with other type checkers, these Unknown types represent false positives where pyrefly is being overly strict about overload ambiguity.
Attribution: The change from is_consistent() to is_equivalent() in pyrefly/lib/alt/overload.rs line 590 made overload resolution more strict. When multiple overloads have different but consistent return types, the new code treats this as ambiguous and returns Any (appearing as Unknown), whereas the old code would pick the first matching overload.
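The ambiguity rule in question can be illustrated with a minimal overloaded function (an illustrative sketch, not actual scipy-stubs code):

```python
from typing import Any, overload

@overload
def pick(x: int) -> int: ...
@overload
def pick(x: str) -> str: ...
def pick(x):
    # Runtime implementation simply returns its argument.
    return x

# An argument of type Any matches both overloads. Their return types
# (int vs str) are each consistent with Any but are not equivalent to
# one another, so under the stricter rule the call is treated as
# ambiguous and evaluates to Any instead of the first overload's
# return type.
value: Any = 42
result = pick(value)
assert result == 42
```

When a checker reports the ambiguous result as Unknown rather than picking the first match, downstream assert-type checks against the precise return type start failing, which matches the scipy-stubs symptoms.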

materialize (-2)

Both removed errors were catching genuine type bugs in the code. The first error correctly identified that asyncio.gather(*[setup(i, git_rev) for ...]) returns Future[list[None]] (since setup() returns None), but run_until_complete() expects Awaitable[tuple[None]] - this is a real type mismatch. The second error correctly identified that line 336 attempts (child.get('messages_total', child.get('rows', 0))) * factor_initial_data where the first child.get() can return None, making this None * float which is invalid. The PR change from is_consistent() to is_equivalent() in overload resolution made pyrefly stricter about type matching, but removing detection of these genuine bugs represents a loss of capability.
Attribution: The change to is_equivalent() vs is_consistent() in pyrefly/lib/alt/overload.rs line 590 made overload resolution stricter about return type equivalence. This likely caused pyrefly to stop accepting these type mismatches that it previously allowed through looser consistency checking.

core (+13, -35)

This is a regression. The PR made overload resolution stricter by changing from 'consistency' to 'equivalence' checking and modifying variadic parameter elimination logic. This caused new false positive errors on common, valid patterns like max(0, volume) where volume is a float. The typing spec's overload resolution rules are meant to handle real ambiguity, not reject standard library usage patterns that work perfectly at runtime. While some removed errors may have been legitimate (the bad-argument-type cases), the new errors represent pyrefly becoming overly strict compared to ecosystem standards. The net effect is worse - introducing false positives on fundamental operations like max() is more problematic than the mixed quality of removed errors.
Attribution: The change to is_equivalent() from is_consistent() in pyrefly/lib/alt/overload.rs line 590 made overload resolution stricter. The modification to eliminate overloads with variadic parameters when fixed arguments are provided (lines 507-540) also changed which overloads are selected. These changes caused new no-matching-overload errors for previously accepted patterns like max(0, float).

trio (+1, -1)

The new error shows a type inference failure where count is inferred as Never instead of int after an isinstance() check that should narrow int | float to int. This is a regression - pyrefly lost the ability to properly narrow union types. The removed error was a false positive about range(float | int) when the value was actually guaranteed to be int after the min() call. While removing the false positive is good, introducing the Never inference failure is worse overall.
Attribution: The change to overload resolution logic in pyrefly/lib/alt/overload.rs, specifically switching from try_call_overload() to call_overload() and changing from is_consistent() to is_equivalent() for return type checking, appears to have broken type inference in union type narrowing scenarios, causing count to be inferred as Never instead of int

scipy (+4, -3)

The new errors are false positives where pyrefly incorrectly flags valid numpy operations. Line 232 shows iact[[icon, nact-1]] = iact[[nact-1, icon]] which is standard numpy fancy indexing to swap array elements - both sides are integer arrays and this operation is well-defined. The removed errors were also false positives where pyrefly incorrectly inferred function return types as None instead of recognizing valid callable returns. The overload resolution changes in the PR fixed the inference failures that caused the false not-callable errors, but introduced new false positives in numpy array type checking.
Attribution: The change to call_overload() in pyrefly/lib/alt/overload.rs modified how overload resolution works, particularly around hint usage and error handling. The removal of the retry logic in try_call_overload() (now call_overload()) likely fixed cases where pyrefly was incorrectly inferring None types due to failed overload resolution, which caused the false positive not-callable errors.

colour (+11, -12)

This appears to be a regression. While the PR implements more spec-compliant overload resolution, it's producing false positives on well-established scientific computing code. The new missing-attribute errors on matplotlib Axes objects are particularly concerning as these are standard methods. The bad-argument-type errors on tstack() calls suggest the stricter overload selection is choosing less compatible overloads than before. The removed errors may have included some false positives, but the new errors appear to be predominantly false positives on working code.
Attribution: The changes to call_overload() in pyrefly/lib/alt/overload.rs modified how overloads are selected and when hints are applied. The switch from is_consistent() to is_equivalent() and the improved variadic parameter elimination logic caused different overloads to be selected, leading to different type inference results.

streamlit (+3, -4)

no-matching-overload: These are false positives caused by stricter overload resolution. The calls to iter() and tuple() are valid but pyrefly now rejects them due to the is_equivalent check being too strict.
bad-argument-type: This is a cascade error from the overload resolution failure. The Unknown type inference failure makes pyrefly think the argument is incompatible with Iterable[@_].

Overall: This is a REGRESSION. The PR made overload resolution stricter by changing from is_consistent to is_equivalent when checking if multiple matching overloads are ambiguous. This causes pyrefly to reject valid calls to built-in functions like iter() and tuple() that mypy/pyright would accept. The appearance of Unknown and @_ types in the error messages indicates type inference failures rather than real bugs in the code. The affected calls (iter(t) and tuple(_keys(...))) are standard Python patterns that should work fine.

Attribution: The change from is_consistent() to is_equivalent() in pyrefly/lib/alt/overload.rs at line 590 appears to be the root cause. This makes overload resolution stricter by requiring equivalent (identical) return types rather than just consistent ones. The elimination of variadic parameter logic changes may also contribute by affecting which overloads are considered.

scikit-learn (+84, -115)

no-matching-overload on unknown: These are false positives caused by overly strict overload matching. Functions like numpy.lib._shape_base_impl.apply_along_axis have valid overloads that should match the provided arguments, but pyrefly's new variadic elimination rules incorrectly reject them.
bad-argument-type on unknown: These appear to be cascade errors from the overload matching issues. When pyrefly can't find a matching overload, it falls back to less precise type information, leading to argument type mismatches.
bad-unpacking on unknown: Similar cascade errors - when function return types become imprecise due to overload matching failures, tuple unpacking operations fail type checking.
bad-index on unknown: More cascade errors from imprecise type inference following overload matching failures.

Overall: This is a regression. The PR makes pyrefly stricter than the typing spec requires and stricter than established type checkers. While the PR description mentions 'properly check for equivalent return types', the actual changes go beyond spec compliance by adding non-spec-compliant variadic elimination rules. The new errors are false positives - sklearn's numpy function calls work correctly at runtime, but pyrefly's overly strict overload matching now rejects valid patterns. The removed errors were mostly inference failures that got fixed as a side effect, which is positive, but the new false positives outweigh this benefit.

Attribution: The changes to overload_resolution() in pyrefly/lib/alt/overload.rs caused these errors. The PR modified overload selection logic in two key ways: (1) it now eliminates variadic overloads when fixed arguments are supplied (lines 507-540), and (2) it changed from checking if overloads are 'consistent' to checking if they are 'equivalent' (line 590). The first change is causing new no-matching-overload errors by being more restrictive about which overloads match. The second change may be removing some ambiguity errors that were previously flagged.

spark (+1, -5)

new-false-positive: min(Literal[1], float) is valid Python that should not be flagged
removed-false-positives: The removed errors were incorrectly flagging valid NumPy and pandas operations

Overall: This is a REGRESSION. The new error incorrectly flags min(1, float_value) as having no matching overload, when this is perfectly valid Python code that mypy/pyright accept. While the PR removes some false positives (improvement), it introduces a new false positive that makes pyrefly stricter than the ecosystem standard. The net effect is negative because it breaks valid, commonly-used Python patterns like min(literal_int, float_expression).

Attribution: The new error is caused by changes to overload resolution in pyrefly/lib/alt/overload.rs, specifically the modification to step 4 that eliminates overloads with variadic parameters when a fixed number of arguments is supplied. The removed errors were likely eliminated by the switch from is_consistent() to is_equivalent() for ambiguity detection and the simplified overload calling logic that removed retry mechanisms.
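The variadic-elimination step can be sketched with a pair of overloads (illustrative only; not the actual builtins stubs for min/max):

```python
from typing import overload

@overload
def combine(x: int, y: int) -> int: ...
@overload
def combine(*args: float) -> float: ...
def combine(*args):
    return sum(args)

# With exactly two arguments supplied, the stricter step 4 eliminates
# the *args overload before the ambiguity check, so combine(1, 2)
# resolves to the fixed-arity overload. min(1, some_float) exercises the
# same machinery in the builtins stubs, where over-eager elimination can
# leave no matching overload at all.
assert combine(1, 2) == 3
```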

aiortc (+2)

These are false positives caused by overly strict overload resolution. The max() function has multiple overloads to handle different numeric type combinations, and max(6, some_float) should successfully match one of them. The PR changed overload ambiguity detection from is_consistent() to is_equivalent(), making pyrefly reject valid calls that mypy/pyright accept. The calls max(6, threshold) and max(1, rtt_calculation) are standard Python patterns that work at runtime and should type-check successfully. This represents pyrefly being more restrictive than the established ecosystem standard.
Attribution: The change to is_equivalent() from is_consistent() in pyrefly/lib/alt/overload.rs line 590 made overload resolution stricter. The new logic requires return types to be equivalent rather than just consistent, causing max(Literal[6], float) to fail overload resolution because different overloads return slightly different types (e.g., int vs float vs int | float).

rich (+1)

Looking at line 677, the error occurs on max(0, task.completed) where task.completed is a float (from the Task dataclass definition at line 952). The built-in max function has overloads that should handle max(int, float) -> float, but pyrefly's stricter overload resolution is now rejecting this common pattern. The PR changed from is_consistent to is_equivalent when checking if multiple matching overloads have compatible return types, making the type checker more strict about what constitutes 'equivalent' return types. Since mypy and pyright both accept max(0, x) where x is float, and this is a fundamental numeric operation that works at runtime, this represents pyrefly being overly strict compared to ecosystem standards. This is a regression - the type checker is now flagging correct code that other tools accept.
Attribution: The change to is_equivalent() in pyrefly/lib/alt/overload.rs at line 590 made overload resolution stricter when checking if multiple matching overloads have compatible return types. This caused pyrefly to reject the max(0, task.completed) call where it previously would have accepted it.

✅ Improvement (37)

spack (-18)

This is an improvement. The PR fixed overload resolution logic that was causing type inference failures, leading to false positive errors. The removed errors were not catching real bugs - they were artifacts of the type checker's inability to properly resolve overloads and infer types. For example, the missing-attribute errors on NoneType claiming core Spack attributes don't exist are clearly wrong for a mature, well-tested project. The no-matching-overload errors for basic posixpath.join calls with string arguments were also incorrect. The PR's changes to overload resolution (switching to is_equivalent() and better handling of variadic parameters) fixed these inference cascades.
Attribution: The changes to call_overload() and overload resolution logic in pyrefly/lib/alt/overload.rs improved type inference accuracy. The switch from is_consistent() to is_equivalent() for comparing overload return types and the elimination of variadic parameter overloads when fixed arguments are provided reduced false positive overload resolution failures, which cascaded to fix the downstream type inference errors.

parso (+1, -3)

The new bad-index error correctly identifies a real bug: on line 588, token[1] accesses index 1 of a string that could have only 1 character (when initial is a single character). Looking at the context, token comes from pseudomatch.group(2) and initial = token[0], so if token has length 1, accessing token[1] will raise IndexError. The removed errors were false positives - the regex match object definitely has .group() and .end() methods when the match succeeds, which it should given the regex design.
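The out-of-bounds access can be reproduced in isolation (the token value below is hypothetical):

```python
token = "("         # a single-character token, e.g. from pseudomatch.group(2)
initial = token[0]  # always safe: the tokenizer guarantees a non-empty match

# token[1] raises IndexError whenever the matched token has length 1,
# which is exactly the case the new bad-index error points at.
try:
    second = token[1]
except IndexError:
    second = None

assert initial == "("
assert second is None
```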
Attribution: The change to is_equivalent() from is_consistent() in pyrefly/lib/alt/overload.rs at line 590 made overload resolution stricter about return type compatibility. This likely improved type inference precision, allowing pyrefly to detect the genuine index bounds violation in the tokenizer code that was previously masked by looser overload matching.

dd-trace-py (-11)

The removed errors were false positives caused by overload resolution inference failures. The PR fixed pyrefly's overload resolution logic to properly infer concrete types instead of incorrectly producing Never/Unknown types. The specific changes - using is_equivalent instead of is_consistent for return type checking and deferring contextual typing until after overload selection - resolved the inference issues that were causing pyrefly to report missing attributes on NoneType and type mismatches with Unknown types. This is an improvement because pyrefly is now correctly inferring types where it previously failed.
Attribution: The changes to overload_resolution() in pyrefly/lib/alt/overload.rs fixed the inference failures by: (1) using proper equivalence checking instead of consistency checking, (2) deferring hint application until after overload selection, and (3) improving variadic parameter elimination logic

pydantic (-1)

The removed error was a false positive. The code getattr(tp, '__mro__', tp.__class__.__mro__)[:-1] on line 320 is valid Python. When tp is None, the expression evaluates to type(None).__mro__[:-1], where type(None).__mro__ is a tuple (which is subscriptable), not None itself. The error incorrectly claimed that None was being subscripted, but the actual subscripting operation is performed on the tuple returned by __mro__. The PR's improvement to overload resolution logic fixed this inference error.
Attribution: The change to is_equivalent() in pyrefly/lib/alt/overload.rs (replacing is_consistent()) improved overload resolution by properly checking return type equivalence. This fixed the false positive where pyrefly incorrectly inferred that tp could be None in the context where subscripting occurs, when in fact the subscripting operation is on the __mro__ tuple, not on tp directly.
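The pattern in question can be demonstrated directly: the `[:-1]` subscript applies to the tuple produced by `__mro__`, never to `tp` itself, so `tp=None` is safe at runtime (the function name below is illustrative).

```python
# The getattr pattern from the pydantic code: fall back to the instance's
# class MRO when the object itself has no __mro__, then drop `object`.
def mro_without_object(tp: object) -> tuple[type, ...]:
    # Subscripting happens on the returned tuple, not on tp.
    return getattr(tp, '__mro__', tp.__class__.__mro__)[:-1]


print(mro_without_object(None))  # (<class 'NoneType'>,)
print(mro_without_object(bool))  # (<class 'bool'>, <class 'int'>)
```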

freqtrade (-8)

The PR improves overload resolution to better match mypy/pyright behavior, which removed several false positive errors. Most removed errors appear to be inference failures (types incorrectly resolving to Unknown/Never) or overly strict overload matching. However, one removed missing-attribute error may have been a true positive: it claimed list has no update method, and lists indeed lack update() (only dicts have it), so losing that diagnostic is a possible regression. The other removals represent improvements in type inference accuracy.

Attribution: The changes to overload_resolution() in pyrefly/lib/alt/overload.rs improved overload selection by: 1) extending step 4 to eliminate variadic overloads when fixed arguments are supplied (aligning with mypy/pyright), 2) changing from is_consistent() to is_equivalent() for return type comparison in step 5, and 3) applying contextual typing hints after overload selection rather than during selection to avoid false negatives.

ibis (+1, -2)

The new error correctly identifies a real bug in the source code: _coerce_to_dict function has an incorrect return type annotation. Looking at line 38, the function returns dict(zip(output_type.names, data)) which creates a dictionary, but the function is annotated as returning tuple. This is a genuine type annotation error that should be fixed. The removed errors were false positives: the not-callable error incorrectly flagged a valid getattr pattern, and the previous bad-return error contained @_ types indicating type inference failure. The PR improved overload resolution accuracy, catching a real annotation bug while removing false positives.
Attribution: The changes to overload resolution in pyrefly/lib/alt/overload.rs modified how overloads are selected and how return type equivalence is checked. The change from is_consistent() to is_equivalent() on line 590 likely made return type checking stricter, causing the new bad-return error to appear. The refactoring of try_call_overload() to call_overload() and changes to hint handling may have fixed the false positive not-callable error by improving type inference.

aioredis (-1)

The removed error was a false positive. Line 164 shows response[field] = int(response[field]) where response is clearly a dict[str, str] (created on line 157 via dict(kv.split(":") for kv in response.split())). Dict item assignment is a fundamental operation - dict[str, str] absolutely supports __setitem__. The error claiming this was an 'unsupported-operation' was incorrect. The PR's improvements to overload resolution fixed this inference failure, allowing pyrefly to properly recognize dict assignment operations.
Attribution: The change to overload resolution logic in pyrefly/lib/alt/overload.rs fixed the false positive. The modifications to call_overload() and the elimination logic for variadic parameters improved type inference accuracy, allowing pyrefly to correctly recognize that dict item assignment is supported.

comtypes (-1)

This is an improvement. The error was a false positive caused by poor overload resolution. The cast(self, POINTER(itf)) call should return a COM interface pointer that has QueryInterface - this is fundamental COM interop behavior. The PR fixed the overload resolution logic to properly determine equivalent return types, eliminating the false positive missing-attribute error.
Attribution: The change from is_consistent() to is_equivalent() in pyrefly/lib/alt/overload.rs line 590 improved overload resolution for the cast() function, allowing it to return the correct COM interface pointer type that includes the QueryInterface method.

starlette (-2)

These were false positive errors. The code uses nested dict.get() calls: base_url_scope.get('app_root_path', base_url_scope.get('root_path', '')). Since the innermost call has a string default (''), the entire expression is guaranteed to return a string, never None. The errors incorrectly claimed path could be None, but the type flow analysis shows it's always a string. Removing these false positives is an improvement in type inference accuracy.
Attribution: The change from is_consistent() to is_equivalent() in pyrefly/lib/alt/overload.rs and the improved overload resolution logic fixed the type inference for dict.get() methods, allowing pyrefly to correctly determine that dict.get() with a string default returns str, not str | None.
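The type-flow argument above can be checked at runtime (function name is illustrative): because the innermost default is a `str`, neither `get()` can ever yield `None`.

```python
# Nested dict.get with string defaults: the result is always str, never None.
def root_path(scope: dict[str, str]) -> str:
    return scope.get('app_root_path', scope.get('root_path', ''))


print(root_path({}))                        # ''
print(root_path({'root_path': '/api'}))     # '/api'
print(root_path({'app_root_path': '/x'}))   # '/x'
```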

zulip (-3)

These are improvements. The PR fixes overload resolution to be more spec-compliant per the typing spec. The removed errors were false positives caused by overly strict overload selection. The code patterns (using .get() with defaults, context managers, Django functions) are standard and should work correctly. The PR's changes to use is_equivalent() instead of is_consistent() for return type checking and improve overload selection logic resolved cases where pyrefly was incorrectly rejecting valid overloads, making the type checker more accurate.
Attribution: The changes to overload_resolution() in pyrefly/lib/alt/overload.rs modified how overloads are selected and how ambiguity is determined. Specifically, changing from is_consistent() to is_equivalent() for return type checking and improving the overload selection logic likely resolved cases where valid overloads were being rejected, removing these false positive type errors.

hydpy (-1)

This is an improvement. The removed error was a type inference failure where pyrefly incorrectly inferred inputs[idx_in] (ndarray element access) as having type ndarray[tuple[Any, ...], dtype[Unknown]] instead of the correct scalar element type. The error message shows pyrefly thought an ndarray was being passed to a function expecting float, but in reality, inputs[idx_in] should be inferred as a float scalar value from the array element access. The PR's changes to overload resolution logic fixed this inference issue, correctly allowing the modelutils.isnan() call to accept the scalar value.
Attribution: The change to overload resolution in pyrefly/lib/alt/overload.rs modified how is_consistent() vs is_equivalent() is used for comparing overload return types (line 590). This likely improved type inference for overloaded functions like modelutils.isnan(), allowing pyrefly to correctly infer that ndarray element access returns scalar types rather than array types.

egglog-python (-6)

The removed errors were false positives caused by incorrect overload resolution. The code shows sum(_NDArray_2 == NDArray.scalar(...)).to_value() where sum() is called on NDArray comparison results, which should return NDArray objects with to_value() methods. The PR improved overload resolution by using stricter equivalence checking instead of consistency checking, which fixed the type inference to correctly identify that sum() on NDArray objects returns NDArray (not int), eliminating the false missing-attribute errors. This is an improvement - pyrefly became more accurate at resolving overloaded function return types.
Attribution: The change from is_consistent() to is_equivalent() in the call_overloads() method in pyrefly/lib/alt/overload.rs improved overload resolution accuracy. This fixed the type inference for overloaded functions like sum(), allowing pyrefly to correctly determine that sum(ndarray) returns NDArray (which has to_value()) rather than incorrectly inferring int.
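The claim that `sum()` over objects with custom addition keeps the object type can be sketched with a stand-in class (the class below is hypothetical, not egglog's NDArray): typeshed's `sum()` overloads return the element type when the elements support addition, and `__radd__` handles the implicit `0` start value.

```python
# Stand-in for an array-like type: sum() over these objects returns the
# object type (which carries to_value()), not int.
class Vec:
    def __init__(self, x: int) -> None:
        self.x = x

    def __add__(self, other: "Vec") -> "Vec":
        return Vec(self.x + other.x)

    def __radd__(self, other: int) -> "Vec":
        # Supports the implicit 0 start value used by sum().
        return self

    def to_value(self) -> int:
        return self.x


total = sum([Vec(1), Vec(2), Vec(3)])
print(total.to_value())  # 6
```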

porcupine (-1)

This is an improvement. The removed error was a false positive where pyrefly incorrectly inferred Unknown | None for a dict.get() call that should clearly return str. The code is type-correct: config_value is dict[str, str] (confirmed by isinstance check), the key is str, and the default is str, so the result must be str. The PR fixed overload resolution issues that were causing this inference failure.
Attribution: The changes to overload resolution logic in pyrefly/lib/alt/overload.rs, particularly the switch from try_call_overload() to call_overload() and the improved equivalent return type checking with is_equivalent(), fixed the type inference that was causing the false positive bad-return error.

cwltool (-2)

These are false positives that were correctly removed. The first error incorrectly claimed that MutableMapping doesn't support item assignment (srcs_of_sink[0]["type"] = src_typ), but MutableMapping inherently supports __setitem__. The second error flagged a complex nested type assignment as incompatible, but the types are actually compatible when considering the full type hierarchy. The PR change to overload resolution made pyrefly stricter about type equivalence, which eliminated these incorrect errors. Both mypy and pyright would not flag these patterns, confirming they were false positives.
Attribution: The change to is_equivalent() from is_consistent() in pyrefly/lib/alt/overload.rs at line 590 made overload resolution stricter about return type equivalence. This changed how pyrefly resolves types in the complex nested scenarios above, eliminating the false positive errors that were previously reported.

aiohttp (-1)

This is an improvement. The removed error was a false positive - the code correctly uses asyncio.gather(*tasks, return_exceptions=True) which returns Future[list[BaseException | Any]], and this is perfectly compatible with run_until_complete's Awaitable[T] parameter. The error claimed a type mismatch where none exists. The PR's change to use is_equivalent() instead of is_consistent() for overload return type checking fixed this incorrect error, making pyrefly more accurate in its type checking of standard asyncio patterns.
Attribution: The change to is_equivalent() from is_consistent() in the overload resolution logic in pyrefly/lib/alt/overload.rs made the return type compatibility check stricter, removing the false positive that was incorrectly flagging this valid asyncio pattern.
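The asyncio pattern recurring in this report (also in black, bandersnatch, psycopg, steam.py) can be exercised directly. This is a minimal runnable sketch of the shape of the code, not the projects' actual shutdown logic: `gather(..., return_exceptions=True)` yields an awaitable of a list mixing results and exceptions, and `run_until_complete` accepts it.

```python
import asyncio


async def ok() -> int:
    return 1


async def boom() -> int:
    raise ValueError("boom")


async def main() -> list[object]:
    # return_exceptions=True collects raised exceptions in the result list
    # instead of propagating them.
    return await asyncio.gather(ok(), boom(), return_exceptions=True)


loop = asyncio.new_event_loop()
try:
    results = loop.run_until_complete(main())
finally:
    loop.close()

print(results)  # [1, ValueError('boom')]
```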

mkosi (-2)

This is an improvement. The PR fixed pyrefly's overload resolution algorithm to be more accurate and less strict, removing false positive errors. The changes implement better compliance with the typing spec's overload resolution steps, particularly around variadic parameter handling and return type equivalence checking. The removed errors were incorrectly flagging valid code patterns that should type-check successfully. Both errors involved legitimate function calls where the previous overload resolution was too restrictive - the Drive.__init__ call with a valid file_id parameter and the contextlib._BaseExitStack.enter_context call with a compatible context manager type.
Attribution: The changes to overload resolution in pyrefly/lib/alt/overload.rs improved the algorithm by: 1) Better handling of variadic parameter elimination (step 4), 2) Changing from is_consistent to is_equivalent for return type checking (step 6), and 3) Improved contextual typing by separating overload selection from hint application. These changes made overload resolution more accurate and less prone to false positives.

mitmproxy (-16)

The removed errors were false positives caused by overly strict overload resolution. The re.Pattern.search() method has overloads that accept both str and bytes arguments, so calling it with bytes(f.request.headers) is perfectly valid. The PR improved overload resolution by properly checking for equivalent return types when determining ambiguity (per the typing spec) and fixing how contextual hints are applied. This eliminated incorrect no-matching-overload errors where valid overloads existed but weren't being properly matched.
Attribution: The change to overload resolution logic in pyrefly/lib/alt/overload.rs, specifically the modification from is_consistent() to is_equivalent() in the resolve_overloads() function and the restructuring of how hints are applied during overload selection, fixed the false positive overload resolution errors.

poetry (-16)

The removed errors were false positives caused by overly strict overload resolution. The code uses standard Python patterns like walrus operator assignments with conditionals (if (name := expr)) and dictionary membership tests ("key" in dict). The PR fixed the overload resolution logic to use proper equivalence checking per the typing spec, which resolved type inference failures that were causing pyrefly to incorrectly infer None types and flag valid operations as errors on NoneType.
Attribution: The change from is_consistent() to is_equivalent() in pyrefly/lib/alt/overload.rs at line 590 improved overload resolution accuracy. The old consistency check could select a single, incorrect return type for calls whose matching overloads did not truly agree on a return type, causing type inference failures that propagated as false positive None type errors. The new is_equivalent() check correctly determines when overloads have equivalent return types, allowing proper type resolution and eliminating the spurious NoneType errors.

paasta (+1, -4)

This is a mixed change. The new min() error appears to be overly strict - min(1, float_value) should work since int and float are comparable, and established type checkers would not flag this. However, the removal of the missing-attribute errors is an improvement since those were false positives caused by incorrect type inference. The code clearly shows sidecar_requirements_config cannot be None due to the default value. Overall, the false positive removals outweigh the new overly-strict error.
Attribution: The changes to overload resolution in pyrefly/lib/alt/overload.rs, specifically the simplification of call_overload() (formerly try_call_overload()) and the switch from is_consistent() to is_equivalent() for return type checking, fixed the false positive missing-attribute errors but introduced stricter overload checking that now flags the min() call.

cibuildwheel (-2)

These were false positive errors. The code correctly handles dictionary access after TOML parsing with a proper fallback mechanism. The configuration variable is guaranteed to be a dictionary, not None, due to the dict.get(key, default) pattern where the default is loaded_file['default'] (another dictionary). The PR's improvement to overload resolution fixed the type inference issue that was incorrectly suggesting None types.
Attribution: The change from is_consistent() to is_equivalent() in the resolve_overloads() method in pyrefly/lib/alt/overload.rs improved overload resolution accuracy, which fixed the incorrect None type inference that was causing these false positive unsupported-operation errors

archinstall (-1)

This is an improvement. The removed error was a false positive where pyrefly incorrectly flagged ZramAlgorithm(algo) when algo has type Any | None. Looking at line 141-142, algo comes from arg.get('algorithm', arg.get('algo', ZramAlgorithm.ZSTD.value)) where the fallback ZramAlgorithm.ZSTD.value is a string, and the dict values could be strings. The Any type is compatible with str since Any can represent any type. The error was incorrectly treating Any | None as incompatible with str, when in fact the Any branch could legitimately be a string value. The PR's improvement to overload resolution (changing from is_consistent to is_equivalent for determining overload ambiguity) fixed this false positive by more accurately determining when enum constructor overloads are truly ambiguous versus equivalent.
Attribution: The change to is_equivalent() vs is_consistent() in pyrefly/lib/alt/overload.rs at line 590 improved overload resolution accuracy. The old is_consistent() method was apparently too lenient in determining when overloads had equivalent return types, leading to incorrect ambiguity detection that caused the false positive error.

optuna (-7)

The removed errors were false positives caused by overly strict overload resolution. Looking at the specific errors: 1) The numpy.arange call on lines 109-113 passes valid arguments (start, stop, step as floats/arrays) that numpy.arange can handle, but pyrefly was incorrectly rejecting this as having no matching overload. 2) The _ScaleType constructor call on line 114 passes self._scale_types[i], which is np.int64, and IntEnum can be constructed from numpy integers, but pyrefly was incorrectly claiming the ndarray type wasn't assignable to int. The PR's overload resolution fixes stopped these calls from being spuriously rejected, removing the false positive errors. This is an improvement because pyrefly now correctly accepts valid code patterns that work at runtime and are accepted by other type checkers.
Attribution: The change to overload resolution logic in pyrefly/lib/alt/overload.rs - specifically the switch from is_consistent() to is_equivalent() on line 590 and the restructured overload selection process - fixed the overly strict overload matching that was causing these false positives.

bandersnatch (-1)

This is an improvement. The removed error was a false positive - the code is actually correct. asyncio.gather(*pending, return_exceptions=True) returns Future[list[BaseException | Any]] which implements Awaitable and is perfectly valid to pass to run_until_complete(). The type checker was being overly strict about exact generic type matching when the runtime behavior is sound. The PR's change from is_consistent() to is_equivalent() in overload resolution made the type checking more accurate by eliminating this false positive.
Attribution: The change to overload resolution in pyrefly/lib/alt/overload.rs switched from is_consistent() to is_equivalent() when checking return types for ambiguity (line 590). This change in the resolve_overload() method made the overload resolution more precise, which eliminated the false positive where pyrefly incorrectly rejected a valid call to run_until_complete() with an awaitable argument.

kopf (-4)

The removed errors were false positives caused by incorrect overload resolution. The code correctly creates binary temporary files with tempfile.NamedTemporaryFile(buffering=0) which returns _TemporaryFileWrapper[bytes], and then correctly writes bytes to them via .write(). The PR improved overload resolution by: (1) changing from 'consistent' to 'equivalent' return type checking per the typing spec, (2) better handling of variadic parameters in step 4, and (3) improved contextual typing. This fixed the type checker's ability to properly resolve the correct overloads for these tempfile operations.
Attribution: The change to is_equivalent() instead of is_consistent() in pyrefly/lib/alt/overload.rs at line 590, combined with the improved overload resolution logic that better handles variadic parameters and contextual typing, fixed the false positive overload resolution failures for tempfile.NamedTemporaryFile and its write() method.

static-frame (-14)

This is an improvement. The PR fixed pyrefly's overload resolution to be more accurate and consistent with established type checkers. The removed errors were false positives where pyrefly was incorrectly inferring ndarray types instead of static-frame's custom types (which do have the 'missing' attributes), and incorrectly flagging yield type mismatches due to imprecise overload selection. The improved overload resolution now correctly identifies the intended types, eliminating these false positive errors. Static-frame is a well-tested library, and these fundamental attribute accesses and generator patterns work correctly at runtime.
Attribution: The changes to overload_resolution() in pyrefly/lib/alt/overload.rs improved overload selection by: (1) better handling of variadic parameters - eliminating variadic overloads when fixed args are provided, (2) deferring hint usage until after overload selection to avoid hint-influenced mismatches, and (3) using stricter equivalence checking for return types. These improvements led to more accurate type inference.

psycopg (-1)

This is an improvement. The removed error was a false positive caused by incorrect type inference and overload resolution. The code asyncio.gather(*[update(value) for value in values]) correctly returns a Future[list[Cursor]], which is assignable to asyncio.wait_for()'s parameter type Awaitable[T] since Future implements Awaitable. The PR's change from is_consistent() to is_equivalent() in overload resolution fixed pyrefly's ability to properly match the asyncio.wait_for() overloads, eliminating this false positive.
Attribution: The change to overload resolution logic in pyrefly/lib/alt/overload.rs, specifically the switch from is_consistent() to is_equivalent() in the resolve_overloads() function (line 590), improved how pyrefly determines when overloads have equivalent return types. This fixed the false positive by better handling the asyncio.wait_for() overloads that return equivalent awaitable types.

schemathesis (-1)

This is a genuine type annotation bug in the source code. The Template.get() method on line 467 declares return type dict[Unknown, Unknown] but actually returns self._template.get(key, default). Since self._template is dict[str, Any] (line 457), the dict.get() method returns Any | None (the value or the default). The method signature is incorrect - it should return Any | None to match the implementation, or the implementation should ensure it always returns a dict. The PR's improved overload resolution correctly identified this type mismatch that was previously missed.
Attribution: The change to is_equivalent() in pyrefly/lib/alt/overload.rs (line 590) made overload resolution stricter about return type equivalence. This likely affected how pyrefly resolves the dict.get() method's overloads, causing it to infer a more precise return type (Any | None) that no longer matches the declared return type (dict[Unknown, Unknown]). The overload resolution changes may have improved type inference precision, revealing this existing annotation bug.

steam.py (-1)

This is an improvement. The removed error was a false positive where pyrefly incorrectly rejected a valid type assignment. The code await asyncio.wait_for(asyncio.gather(*futs), timeout=60) is correct - asyncio.gather(*futs) returns Future[list[CMsgClientPersonaState]], and asyncio.wait_for() accepts Awaitable[T] | Future[T] where T can be the list type. The PR's changes to overload resolution logic, particularly using is_equivalent() instead of is_consistent() for checking overload return type compatibility, fixed this false positive by properly recognizing that the different asyncio.wait_for overloads have equivalent return types for this call pattern.
Attribution: The change to overload resolution logic in pyrefly/lib/alt/overload.rs - specifically the switch from is_consistent() to is_equivalent() in the overload ambiguity check and the improved handling of contextual typing with hints - fixed the false positive by properly determining that the asyncio.wait_for overloads have equivalent return types.

pytest (-1)

The removed error was a false positive. Looking at line 625, the code calls int(getattr(logging, log_level, log_level)) where log_level has type Any | None. The getattr function returns either a logging constant (which is an int) or the fallback value log_level. The int() constructor has overloads for int.__new__(cls: type[int], x: int = 0) and int.__new__(cls: type[int], x: str | bytes | bytearray, base: int = 10) among others. Since Any | None should be compatible with these overloads (Any is compatible with anything, and None would only be an issue at runtime), pyrefly was incorrectly rejecting valid overload matches. The PR's changes to use is_equivalent() instead of is_consistent() for overload return type checking appears to have fixed this false positive by being more precise about when overloads are truly incompatible.
Attribution: The change to is_equivalent() in pyrefly/lib/alt/overload.rs at line 590 made overload resolution stricter about return type equivalence. This likely fixed cases where pyrefly was incorrectly rejecting valid overload matches due to overly strict consistency checks. The removal of the hint parameter during initial overload selection (lines 488, 575) may have also contributed by preventing contextual typing from interfering with overload matching.
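The flagged pattern can be reproduced directly (function name is illustrative): `getattr` over the logging module maps a level name to its numeric constant and falls back to the raw value, which `int()` then accepts for numeric strings.

```python
import logging


# Map a level name like "DEBUG" to its numeric constant; for values that
# are not attributes of the logging module, fall back to int() on the
# raw value (valid for numeric strings).
def to_level(log_level: str) -> int:
    return int(getattr(logging, log_level, log_level))


print(to_level("DEBUG"))  # 10
print(to_level("42"))     # 42
```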

openlibrary (-46)

This is an improvement. The PR fixed overload resolution to properly check for equivalent return types per the typing spec, replacing a weaker 'consistent' check. The removed errors were false positives caused by type inference failures - complex nested Unknown types, incorrect missing-attribute claims on NoneType (likely in code with proper None checks), and wrong assertions about basic dict operations. The overload resolution improvements eliminated these inference failures while maintaining correct type checking behavior.
Attribution: The change to is_equivalent() from is_consistent() in pyrefly/lib/alt/overload.rs at line 590 improved overload resolution accuracy. The refactoring of try_call_overload() to call_overload() and the improved handling of contextual typing with hints (lines 484-616) reduced false positive type inference failures. The enhanced variadic parameter elimination logic (lines 510-540) made overload selection more precise, eliminating cases where pyrefly incorrectly rejected valid calls.

black (-1)

This is an improvement. The removed error was a false positive - pyrefly was incorrectly rejecting valid asyncio code. The asyncio.gather(*to_cancel, return_exceptions=True) call returns Future[list[BaseException | Any]], which is a valid Awaitable[list[BaseException | Any]] that run_until_complete() should accept. The PR's overload resolution improvements correctly resolved this by using equivalent return type checking rather than the previous stricter consistency check, allowing the type checker to properly match the run_until_complete(awaitable: Awaitable[T]) -> T overload.
Attribution: The change to overload resolution in pyrefly/lib/alt/overload.rs - specifically the switch from is_consistent() to is_equivalent() on line 590 and the improved overload selection logic - fixed the false positive by properly resolving the run_until_complete() overloads to accept the correct Future type.

cloud-init (+1, -19)

This is primarily an improvement. The PR makes overload resolution more spec-compliant by using is_equivalent instead of is_consistent for return type checking, per the typing spec. The single new error correctly identifies a real bug: line 554 of url_helper.py calls v.get(k), where v comes from headers_redact and can be a bool, which has no get() method. The 19 removed errors were mostly false positives involving Unknown types and type inference failures; patterns like Literal[b''] | Unknown in the removed errors are classic signs of inference problems rather than real type violations.
Attribution: The change to is_equivalent() in pyrefly/lib/alt/overload.rs line 590 improved overload resolution by properly checking return type equivalence per the typing spec. The refactoring of try_call_overload() to call_overload() and the hint handling changes also contributed to better type inference, reducing false positive Unknown type errors.

jax (-11)

This is an improvement. The PR fixed overload resolution logic in pyrefly, which removed false positive errors where the type checker was incorrectly flagging valid code. The removed errors include: (1) unsupported-operation errors claiming list item access isn't supported when it clearly is, (2) missing-attribute errors claiming core attributes are missing from established JAX classes, and (3) bad-argument-type errors involving inference failures with complex union types. These were all false positives - the code was correct but pyrefly's type inference was failing. The changes to overload resolution in pyrefly/lib/alt/overload.rs, particularly switching from is_consistent() to is_equivalent() for ambiguity checking and improving hint handling, fixed these inference issues.
Attribution: The changes to overload resolution in pyrefly/lib/alt/overload.rs improved how pyrefly handles overload selection and type inference. Specifically, the change from is_consistent() to is_equivalent() in the ambiguity check and the restructuring of how hints are applied during overload selection likely fixed inference issues that were causing pyrefly to incorrectly infer Never or lose track of concrete types, leading to the false positive errors that were removed.

sphinx (-1)

The removed error was a false positive caused by incorrect type inference. Looking at line 106: kwargs['srcdir'] = srcdir = sphinx_test_tempdir / kwargs.get('srcdir', test_root), the kwargs.get('srcdir', test_root) call has a non-None default value test_root, so it never returns None. The error incorrectly claimed that Path division (/) was being used with None, but the actual type is either a string from kwargs or the default test_root string. The PR's improvement to overload resolution (changing from is_consistent() to is_equivalent() for determining overload ambiguity) fixed the type inference that was causing this false positive.
Attribution: The change to is_equivalent() vs is_consistent() in pyrefly/lib/alt/overload.rs at line 590 improved overload resolution accuracy. The old is_consistent() method was likely too permissive in determining when overloads were equivalent, leading to incorrect type inference that caused the false positive about None being passed to Path division.

xarray (+6, -39)

This appears to be an improvement. The PR changes to overload resolution have reduced many false positive errors while introducing only a few new errors that appear to involve complex union types with Unknown components. The removed errors were predominantly false positives - particularly the 14 missing-attribute errors claiming core xarray attributes like data, dims, _data don't exist on fundamental classes, which is clearly incorrect for a well-tested library like xarray. The new errors seem to be edge cases involving type inference with Unknown types rather than genuine bugs being caught. The change from is_consistent() to is_equivalent() for return type checking aligns with the typing spec requirement that overloads should only be considered ambiguous if their return types are not equivalent.
Attribution: The changes to overload_resolution() in pyrefly/lib/alt/overload.rs modified how overloads are selected and when hints are applied. The key changes include: (1) renaming try_call_overload() to call_overload() and changing when hints are used, (2) modifying the variadic parameter elimination logic to be bidirectional, and (3) changing from is_consistent() to is_equivalent() for checking return type compatibility. These changes appear to have improved type inference accuracy, reducing false positive errors.

rotki (-2)

These were false positive overload resolution errors. The old pyrefly was incorrectly rejecting valid function calls where the argument types were compatible with available overloads. The int() constructor accepts Any arguments, and sum() with a list of addable types should return the element type. The PR improved overload resolution to correctly handle these cases by fixing the logic for eliminating incompatible overloads and using equivalent return type checking instead of consistency checking.
Attribution: The changes to overload resolution logic in pyrefly/lib/alt/overload.rs, specifically the switch from try_call_overload to call_overload and the improved logic for matching overloads with variadic parameters and equivalent return types, fixed the false positive overload resolution failures.
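A sketch of the rotki-style pattern that was previously rejected (function name and values hypothetical): int() accepts an Any-typed argument, and sum() over a list of floats should infer a float result rather than fail overload resolution.

```python
from typing import Any

def add_count(values: list[float], raw: Any) -> float:
    # int() accepts an Any-typed argument via its object overload,
    # and sum() over list[float] infers a float-compatible result;
    # neither call should be flagged as an overload-resolution failure.
    count = int(raw)
    return sum(values) + count

print(add_count([1.0, 2.0], "3"))
```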

discord.py (+1, -11)

The new error correctly identifies a genuine type issue where list[PermissionOverwrite] | object cannot be used as Iterable[PermissionOverwrite] because object is not iterable. This follows the typing spec on overload evaluation which requires equivalent return types for ambiguous overloads. The removed errors were false positives involving None values in conditional contexts and inference failures with Any/@_ types. The PR change from is_consistent() to is_equivalent() in overload resolution made the type checker more precise at catching real type mismatches while reducing noise from overly strict checks on partially-inferred types.
Attribution: The change to is_equivalent() from is_consistent() in pyrefly/lib/alt/overload.rs line 590 made overload resolution stricter, requiring truly equivalent return types rather than just consistent ones. This improved precision in detecting when overloads have genuinely different return types, catching the type error in the new case while reducing false positives in the removed cases.

➖ Neutral (3)

beartype (+1, -1)

Same errors at same locations with same error kinds — message wording changed, no behavioral impact.

hydra-zen (+1, -1)

Same errors at same locations with same error kinds — message wording changed, no behavioral impact.

meson (+8, -10)

Most errors at same locations with same error kinds — message wording changed with minor residual noise, no significant behavioral impact.

Suggested Fix

Summary: The PR made overload resolution overly strict, causing false positives on common Python patterns like max(0, float_value) that work at runtime and are accepted by other type checkers.

1. In overload_resolution() in pyrefly/lib/alt/overload.rs, modify the variadic parameter elimination logic (lines 507-540) so the bidirectional elimination applies only when there is genuine ambiguity. Add a guard: when fixed arguments are provided, eliminate variadic overloads only if the non-variadic overloads actually match the call signature. This prevents rejecting valid calls to max(), min(), etc. with mixed numeric types.

Files: pyrefly/lib/alt/overload.rs
Confidence: high
Affected projects: kornia, scrapy, pip, tornado, aiortc, rich, spark, core
Fixes: no-matching-overload
The bidirectional variadic elimination is too aggressive and rejects valid overloads for built-in functions. The original spec-compliant behavior should be restored for cases where the variadic overloads are the only valid matches. Expected outcome: eliminates 'no-matching-overload' errors for max(0, float), min(1, float), etc. across multiple projects.
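A sketch of the pattern the guard should preserve (function name hypothetical): mixing an int literal with a float in max()/min() is valid at runtime and accepted by other type checkers, so the variadic overloads must remain candidates.

```python
def clamp01(value: float) -> float:
    # max(0, ...) and min(1, ...) mix an int literal with a float
    # argument; this is exactly the pattern the too-aggressive
    # variadic elimination rejected with no-matching-overload.
    return max(0, min(1, value))

print(clamp01(0.25))
print(clamp01(1.5))
print(clamp01(-2.0))
```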

2. In overload_resolution() in pyrefly/lib/alt/overload.rs, relax the is_equivalent() check on line 590 for built-in numeric functions. Add a special case: when checking overload ambiguity for functions like max, min, sum, if the return types are all numeric types (int, float, complex) or their unions, treat them as equivalent rather than ambiguous. This aligns with Python's numeric tower semantics.

Files: pyrefly/lib/alt/overload.rs
Confidence: medium
Affected projects: kornia, scrapy, pip, tornado, aiortc, rich, spark, core, trio
Fixes: no-matching-overload, bad-argument-type
The strict equivalence check is causing issues with numeric functions that have overloads returning different but compatible numeric types. Python's numeric tower makes int/float interchangeable in most contexts. Expected outcome: eliminates remaining 'no-matching-overload' errors on numeric built-ins across 8+ projects.
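A sketch of why the numeric tower makes these return types interchangeable in practice (function name hypothetical): int is accepted wherever float is expected, so a sum() that may return int or float composes fine with float-typed code.

```python
def average(xs: list[float]) -> float:
    # sum() over list[float] may be typed as returning int or float
    # depending on which overload matches; under the numeric tower
    # either result divides cleanly and satisfies the float return.
    return sum(xs) / len(xs)

# int literals are valid members of list[float] under the numeric tower.
print(average([1, 2.0, 3]))
```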



Classification by primer-classifier (3 heuristic, 53 LLM)

Development

Successfully merging this pull request may close these issues.

Spec Violation: Ambiguous overloads are not resolved to Any
