Remove list.__add__ overloads. #14282


Draft: randolf-scholz wants to merge 1 commit into main
Conversation

randolf-scholz (Contributor) commented Jun 16, 2025

See #14283.

The overloads on list.__add__ lead to a divergence between mypy and pyright. Code sample (runnable in the mypy and pyright playgrounds): https://mypy-play.net/?mypy=latest&python=3.12&gist=abf6a8834020af17a16bd8cfb44b2f10

from typing import Any, overload

class ListA[T]:  # emulates builtins.list, with the overloads
    @overload
    def __add__(self, other: "ListA[T]", /) -> "ListA[T]": ...
    @overload
    def __add__[S](self, other: "ListA[S]", /) -> "ListA[T | S]": ...
    def __add__(self, other: "ListA[Any]", /) -> "ListA[Any]":
        return ListA()


class ListB[T]:  # without overloads
    def __add__[S](self, other: "ListB[S]", /) -> "ListB[T | S]":
        return ListB()

                                            # mypy              | pyright                             
reveal_type( list[str]() + list[str]() )    # list[str]         | list[str]          ✅
reveal_type( list[str]() + list[int]() )    # list[str | int]   | list[str | int]    ✅
reveal_type( list[str]() + list[Any]() )    # list[Any]         | list[str]          ❌

reveal_type( ListA[str]() + ListA[str]() )  # ListA[str]        | ListA[str]         ✅
reveal_type( ListA[str]() + ListA[int]() )  # ListA[str | int]  | ListA[str | int]   ✅
reveal_type( ListA[str]() + ListA[Any]() )  # ListA[Any]        | ListA[str]         ❌

reveal_type( ListB[str]() + ListB[str]() )  # ListB[str]        | ListB[str]         ✅
reveal_type( ListB[str]() + ListB[int]() )  # ListB[str | int]  | ListB[str | int]   ✅
reveal_type( ListB[str]() + ListB[Any]() )  # ListB[str | Any]  | ListB[str | Any]   ✅

A comment added three years ago in #8293 states:

# Overloading looks unnecessary, but is needed to work around complex mypy problems

I want to see the impact on mypy_primer, and whether this comment is still valid given that mypy has had several major releases since then.
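
For reference, the fragment of builtins.pyi being changed looks roughly like the sketch below. This is a paraphrase written .pyi-style, not the literal typeshed source; _T is the list's class-scoped typevar and _S a method-scoped one, and the parameter name may differ.

class list(MutableSequence[_T]):
    # Current stub: two overloads on __add__.
    @overload
    def __add__(self, value: list[_T], /) -> list[_T]: ...
    @overload
    def __add__(self, value: list[_S], /) -> list[_S | _T]: ...
    # This PR drops the first overload and keeps only the general signature,
    # matching the ListB variant in the example above.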

randolf-scholz changed the title from "do not merge" to "Check usefulness of list.__add__ overloads." on Jun 16, 2025

Diff from mypy_primer, showing the effect of this PR on open source code:

meson (https://github.com/mesonbuild/meson)
+ mesonbuild/scripts/gettext.py:54:70: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/utils/universal.py:2365:87: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/cmake/interpreter.py:1061:69: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/cmake/interpreter.py:1199:60: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ run_mypy.py:154:62: error: Unsupported operand types for + ("list[str | Any]" and "list[str]")  [operator]
+ mesonbuild/modules/gnome.py:1548:23: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/gnome.py:1565:77: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/depfixer.py:436:66: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:62:40: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:73:40: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:83:35: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:120:35: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:128:35: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:134:59: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:140:64: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:147:64: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:156:47: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/scripts/coverage.py:172:40: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/_qt.py:453:114: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/_qt.py:474:126: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/_qt.py:872:86: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/modules/_qt.py:916:173: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ mesonbuild/mtest.py:2250:47: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

pydantic (https://github.com/pydantic/pydantic)
- pydantic/aliases.py:29: error: Incompatible types in assignment (expression has type "list[str]", variable has type "list[int | str]")  [assignment]
- pydantic/aliases.py:29: note: "list" is invariant -- see https://mypy.readthedocs.io/en/stable/common_issues.html#variance
- pydantic/aliases.py:29: note: Consider using "Sequence" instead, which is covariant
- pydantic/aliases.py:29: error: Argument 1 to "list" has incompatible type "tuple[str | int, ...]"; expected "Iterable[str]"  [arg-type]
+ pydantic/aliases.py:29: error: Argument 1 to "list" has incompatible type "tuple[str | int, ...]"; expected "Iterable[int]"  [arg-type]

cloud-init (https://github.com/canonical/cloud-init)
+ cloudinit/config/cc_mounts.py:420: error: Incompatible return value type (got "list[list[str | Any | None]]", expected "list[list[str]]")  [return-value]

zulip (https://github.com/zulip/zulip)
+ zerver/lib/export.py:1703: error: Unsupported operand types for + ("list[dict[str, Any]]" and "list[dict[str, Any]]")  [operator]

pandas (https://github.com/pandas-dev/pandas)
+ pandas/io/stata.py:1458: error: List comprehension has incompatible type List[object]; expected List[str | dtype[Any]]  [misc]
+ pandas/core/frame.py:11090: error: Incompatible return value type (got "Any | DataFrame | Series", expected "DataFrame")  [return-value]
+ pandas/core/groupby/ops.py:800: error: No overload variant of "unique" matches argument types "list[int]", "bool"  [call-overload]
+ pandas/core/groupby/ops.py:800: note: Possible overload variants:
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False] = ..., return_inverse: Literal[False] = ..., return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> ndarray[tuple[Any, ...], dtype[_ScalarT]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False] = ..., return_inverse: Literal[False] = ..., return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> ndarray[tuple[Any, ...], dtype[Any]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[False] = ..., return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[False] = ..., return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False], return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False] = ..., *, return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False], return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False] = ..., *, return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False], return_inverse: Literal[False], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False] = ..., return_inverse: Literal[False] = ..., *, return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False], return_inverse: Literal[False], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False] = ..., return_inverse: Literal[False] = ..., *, return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[True], return_counts: Literal[False] = ..., axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[False], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[False] = ..., *, return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[False], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[False] = ..., *, return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False], return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[False] = ..., *, return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False], return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[False] = ..., *, return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def [_ScalarT: generic[Any]] unique(ar: _SupportsArray[dtype[_ScalarT]] | _NestedSequence[_SupportsArray[dtype[_ScalarT]]], return_index: Literal[True], return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[_ScalarT]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/core/groupby/ops.py:800: note:     def unique(ar: _Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], return_index: Literal[True], return_inverse: Literal[True], return_counts: Literal[True], axis: SupportsIndex | None = ..., *, equal_nan: bool = ...) -> tuple[ndarray[tuple[Any, ...], dtype[Any]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]], ndarray[tuple[Any, ...], dtype[signedinteger[_32Bit | _64Bit]]]]
+ pandas/tests/strings/conftest.py:6: error: Need type annotation for "_any_string_method"  [var-annotated]
+ pandas/tests/groupby/conftest.py:130: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ pandas/tests/frame/test_constructors.py:2318: error: Unsupported operand types for + ("list[ExtensionDtype | str | dtype[Any] | type[object] | type[str] | type[complex] | type[bool]]" and "list[ExtensionDtype | str | dtype[Any] | type[str] | type[complex] | type[bool] | type[object]]")  [operator]
+ pandas/tests/frame/test_constructors.py:2328: error: Unsupported operand types for + ("list[ExtensionDtype | str | dtype[Any] | type[object] | type[str] | type[complex] | type[bool]]" and "list[ExtensionDtype | str | dtype[Any] | type[str] | type[complex] | type[bool] | type[object]]")  [operator]
+ pandas/tests/arrays/test_datetimes.py:69: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ pandas/conftest.py:1638: error: Unsupported operand types for + ("list[ExtensionDtype | str | dtype[Any] | type[object] | type[str] | type[complex] | type[bool]]" and "list[ExtensionDtype | str | dtype[Any] | type[str] | type[complex] | type[bool] | type[object]]")  [operator]

freqtrade (https://github.com/freqtrade/freqtrade)
+ freqtrade/exchange/exchange_utils.py:119: error: Incompatible types (expression has type "list[dict[str, Any] | dict[str, str]]", TypedDict item "trade_modes" has type "list[TradeModeType]")  [typeddict-item]
+ freqtrade/data/entryexitanalysis.py:287: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ freqtrade/data/entryexitanalysis.py:292: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ freqtrade/data/entryexitanalysis.py:298: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]
+ freqtrade/data/entryexitanalysis.py:299: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

spark (https://github.com/apache/spark)
+ python/pyspark/pandas/groupby.py:3682: error: Unsupported operand types for + ("list[Series[Any]]" and "list[Series[Any]]")  [operator]

mkosi (https://github.com/systemd/mkosi)
+ mkosi/qemu.py:447:22: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

paasta (https://github.com/yelp/paasta)
+ paasta_tools/config_utils.py:101: error: Unsupported operand types for + ("List[str]" and "List[str]")  [operator]

mypy (https://github.com/python/mypy)
+ mypyc/crash.py:29: error: Unsupported operand types for + ("list[FrameSummary]" and "list[FrameSummary]")  [operator]
+ mypyc/crash.py:29: note: See https://mypy.rtfd.io/en/stable/_refs.html#code-operator for more info
+ mypy/strconv.py:454: error: Unsupported operand types for + ("list[Any]" and "list[Union[str, tuple[str, list[Any]]]]")  [operator]
+ mypy/errors.py:1285: error: Unsupported operand types for + ("list[FrameSummary]" and "StackSummary")  [operator]
+ mypy/fastparse.py:1612: error: Unsupported operand types for + ("list[expr]" and "list[expr]")  [operator]
+ mypy/test/testcmdline.py:77: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

core (https://github.com/home-assistant/core)
+ homeassistant/components/monzo/sensor.py:91: error: Unsupported operand types for + ("list[MonzoSensor]" and "list[MonzoSensor]")  [operator]

materialize (https://github.com/MaterializeInc/materialize)
+ misc/python/materialize/cli/ci_upload_heap_profiles.py:42: error: Unsupported operand types for + ("list[str | Any]" and "list[str]")  [operator]

spack (https://github.com/spack/spack)
+ lib/spack/spack/environment/environment.py:1525: error: Incompatible return value type (got "tuple[list[Any], list[Any], list[tuple[Any, Any] | tuple[Any, None]]]", expected "tuple[list[Spec], list[Spec], list[tuple[Spec, Spec]]]")  [return-value]

prefect (https://github.com/PrefectHQ/prefect)
- src/prefect/utilities/callables.py:579: error: Incompatible types in assignment (expression has type "list[None]", variable has type "list[Optional[expr]]")  [assignment]
- src/prefect/utilities/callables.py:579: note: "list" is invariant -- see https://mypy.readthedocs.io/en/stable/common_issues.html#variance
- src/prefect/utilities/callables.py:579: note: Consider using "Sequence" instead, which is covariant

tornado (https://github.com/tornadoweb/tornado)
+ tornado/netutil.py:158: error: Incompatible types in assignment (expression has type "tuple[Union[int, Any], ...]", variable has type "Union[tuple[str, int], tuple[str, int, int, int], tuple[int, bytes]]")  [assignment]
+ tornado/autoreload.py:234: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

scipy (https://github.com/scipy/scipy)
+ scipy/fft/_pocketfft/tests/test_basic.py:473: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fft/_pocketfft/tests/test_basic.py:483: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fft/_pocketfft/tests/test_basic.py:502: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fft/_pocketfft/tests/test_basic.py:512: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fftpack/tests/test_basic.py:429: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fftpack/tests/test_basic.py:439: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fftpack/tests/test_basic.py:458: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/fftpack/tests/test_basic.py:468: error: Unsupported operand types for + ("list[int]" and "list[int]")  [operator]
+ scipy/optimize/tests/test_optimize.py:1289: error: Unsupported operand types for + ("list[str]" and "list[str]")  [operator]

graphql-core (https://github.com/graphql-python/graphql-core)
+ src/graphql/type/validate.py:409: error: Unsupported operand types for + ("list[NamedTypeNode]" and "list[NamedTypeNode]")  [operator]
+ src/graphql/validation/rules/overlapping_fields_can_be_merged.py:89: error: Unsupported operand types for + ("list[FieldNode]" and "list[FieldNode]")  [operator]

randolf-scholz changed the title from "Check usefulness of list.__add__ overloads." to "Remove list.__add__ overloads." on Jun 16, 2025
randolf-scholz (Contributor, Author) commented Jun 16, 2025

There must be a strange bug in mypy. Take this example from the mypy_primer output above.

If I change the offending line (reported as Unsupported operand types for + ("list[FrameSummary]" and "list[FrameSummary]")) from

for s in traceback.format_list(tb + tb2):

to

dummy = tb + tb2
for s in traceback.format_list(dummy):

then the file passes without issue (using python -m mypy.stubtest --custom-typeshed-dir=../typeshed/ tmp).
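
A hedged sketch of that pattern, for reference; the function name and surrounding code here are invented, so this is a paraphrase of the shape of the flagged code, not the actual contents of the file in mypy's repo:

import sys
import traceback

def render(tb: list[traceback.FrameSummary], tb2: list[traceback.FrameSummary]) -> None:
    # Shape of the line the primer run flagged: the result of `+` is passed
    # directly as a call argument.
    for s in traceback.format_list(tb + tb2):
        sys.stderr.write(s)

    # Equivalent rewrite that reportedly type-checks cleanly with the modified
    # stubs: bind the concatenation to a local first, then pass it on.
    dummy = tb + tb2
    for s in traceback.format_list(dummy):
        sys.stderr.write(s)

If that difference holds up, it would point at an interaction between the new generic __add__ signature and the enclosing call's type context in mypy, rather than at the call sites themselves.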
