diff --git a/CHANGELOG.md b/CHANGELOG.md index d8237795112b..f28cdb1ccc25 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,14 +1,266 @@ # Mypy Release Notes -## Unreleased +## Next release -... +Stubgen will now include `__all__` in its output if it is in the input file (PR [16356](https://github.com/python/mypy/pull/16356)). + +## Mypy 1.7 + +We’ve just uploaded mypy 1.7 to the Python Package Index ([PyPI](https://pypi.org/project/mypy/)). Mypy is a static type checker for Python. This release includes new features, performance improvements and bug fixes. You can install it as follows: + + python3 -m pip install -U mypy + +You can read the full documentation for this release on [Read the Docs](http://mypy.readthedocs.io). + +#### Using TypedDict for `**kwargs` Typing + +Mypy now has support for using `Unpack[...]` with a TypedDict type to annotate `**kwargs` arguments enabled by default. Example: + +```python +# Or 'from typing_extensions import ...' +from typing import TypedDict, Unpack + +class Person(TypedDict): + name: str + age: int + +def foo(**kwargs: Unpack[Person]) -> None: + ... + +foo(name="x", age=1) # Ok +foo(name=1) # Error +``` + +The definition of `foo` above is equivalent to the one below, with keyword-only arguments `name` and `age`: + +```python +def foo(*, name: str, age: int) -> None: + ... +``` + +Refer to [PEP 692](https://peps.python.org/pep-0692/) for more information. Note that unlike in the current version of the PEP, mypy always treats signatures with `Unpack[SomeTypedDict]` as equivalent to their expanded forms with explicit keyword arguments, and there aren't special type checking rules for TypedDict arguments. + +This was contributed by Ivan Levkivskyi back in 2022 (PR [13471](https://github.com/python/mypy/pull/13471)). + +#### TypeVarTuple Support Enabled (Experimental) + +Mypy now has support for variadic generics (TypeVarTuple) enabled by default, as an experimental feature. Refer to [PEP 646](https://peps.python.org/pep-0646/) for the details. + +TypeVarTuple was implemented by Jared Hance and Ivan Levkivskyi over several mypy releases, with help from Jukka Lehtosalo. 
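For example, code along the following lines now type checks out of the box (a minimal sketch, not taken from the release notes; the `Ts` and `to_tuple` names are illustrative):

```python
# Or 'from typing_extensions import ...'
from typing import TypeVarTuple, Unpack

Ts = TypeVarTuple("Ts")

def to_tuple(*args: Unpack[Ts]) -> tuple[Unpack[Ts]]:
    return args

t: tuple[int, str, None] = to_tuple(1, "x", None)  # Ok, Ts is bound to (int, str, None)
```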
+ +Changes included in this release: + + * Fix handling of tuple type context with unpacks (Ivan Levkivskyi, PR [16444](https://github.com/python/mypy/pull/16444)) + * Handle TypeVarTuples when checking overload constraints (robjhornby, PR [16428](https://github.com/python/mypy/pull/16428)) + * Enable Unpack/TypeVarTuple support (Ivan Levkivskyi, PR [16354](https://github.com/python/mypy/pull/16354)) + * Fix crash on unpack call special-casing (Ivan Levkivskyi, PR [16381](https://github.com/python/mypy/pull/16381)) + * Some final touches for variadic types support (Ivan Levkivskyi, PR [16334](https://github.com/python/mypy/pull/16334)) + * Support PEP-646 and PEP-692 in the same callable (Ivan Levkivskyi, PR [16294](https://github.com/python/mypy/pull/16294)) + * Support new `*` syntax for variadic types (Ivan Levkivskyi, PR [16242](https://github.com/python/mypy/pull/16242)) + * Correctly handle variadic instances with empty arguments (Ivan Levkivskyi, PR [16238](https://github.com/python/mypy/pull/16238)) + * Correctly handle runtime type applications of variadic types (Ivan Levkivskyi, PR [16240](https://github.com/python/mypy/pull/16240)) + * Support variadic tuple packing/unpacking (Ivan Levkivskyi, PR [16205](https://github.com/python/mypy/pull/16205)) + * Better support for variadic calls and indexing (Ivan Levkivskyi, PR [16131](https://github.com/python/mypy/pull/16131)) + * Subtyping and inference of user-defined variadic types (Ivan Levkivskyi, PR [16076](https://github.com/python/mypy/pull/16076)) + * Complete type analysis of variadic types (Ivan Levkivskyi, PR [15991](https://github.com/python/mypy/pull/15991)) + +#### New Way of Installing Mypyc Dependencies + +If you want to install package dependencies needed by mypyc (not just mypy), you should now install `mypy[mypyc]` instead of just `mypy`: + +``` +python3 -m pip install -U 'mypy[mypyc]' +``` + +Mypy has many more users than mypyc, so always installing mypyc dependencies would often bring unnecessary dependencies. + +This change was contributed by Shantanu (PR [16229](https://github.com/python/mypy/pull/16229)). + +#### New Rules for Re-exports + +Mypy no longer considers an import such as `import a.b as b` as an explicit re-export. The old behavior was arguably inconsistent and surprising. This may impact some stub packages, such as older versions of `types-six`. You can change the import to `from a import b as b`, if treating the import as a re-export was intentional. + +This change was contributed by Anders Kaseorg (PR [14086](https://github.com/python/mypy/pull/14086)). + +#### Improved Type Inference + +The new type inference algorithm that was recently introduced to mypy (but was not enabled by default) is now enabled by default. It improves type inference of calls to generic callables where an argument is also a generic callable, in particular. You can use `--old-type-inference` to disable the new behavior. + +The new algorithm can (rarely) produce different error messages, different error codes, or errors reported on different lines. This is more likely in cases where generic types were used incorrectly. + +The new type inference algorithm was contributed by Ivan Levkivskyi. PR [16345](https://github.com/python/mypy/pull/16345) enabled it by default. + +#### Narrowing Tuple Types Using len() + +Mypy now can narrow tuple types using `len()` checks. Example: + +```python +def f(t: tuple[int, int] | tuple[int, int, int]) -> None: + if len(t) == 2: + a, b = t # Ok + ... 
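    else:
        # Illustrative continuation (not part of the original example):
        # in this branch mypy narrows 't' to 'tuple[int, int, int]'.
        a, b, c = t  # Also ok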
+``` + +This feature was contributed by Ivan Levkivskyi (PR [16237](https://github.com/python/mypy/pull/16237)). + +#### More Precise Tuple Lengths (Experimental) + +Mypy supports experimental, more precise checking of tuple type lengths through `--enable-incomplete-feature=PreciseTupleTypes`. Refer to the [documentation](https://mypy.readthedocs.io/en/latest/command_line.html#enabling-incomplete-experimental-features) for more information. + +More generally, we are planning to use `--enable-incomplete-feature` to introduce experimental features that would benefit from community feedback. + +This feature was contributed by Ivan Levkivskyi (PR [16237](https://github.com/python/mypy/pull/16237)). + +#### Mypy Changelog + +We now maintain a [changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md) in the mypy Git repository. It mirrors the contents of [mypy release blog posts](https://mypy-lang.blogspot.com/). We will continue to also publish release blog posts. In the future, release blog posts will be created based on the changelog near a release date. + +This was contributed by Shantanu (PR [16280](https://github.com/python/mypy/pull/16280)). + +#### Mypy Daemon Improvements + + * Fix daemon crash caused by deleted submodule (Jukka Lehtosalo, PR [16370](https://github.com/python/mypy/pull/16370)) + * Fix file reloading in dmypy with --export-types (Ivan Levkivskyi, PR [16359](https://github.com/python/mypy/pull/16359)) + * Fix dmypy inspect on Windows (Ivan Levkivskyi, PR [16355](https://github.com/python/mypy/pull/16355)) + * Fix dmypy inspect for namespace packages (Ivan Levkivskyi, PR [16357](https://github.com/python/mypy/pull/16357)) + * Fix return type change to optional in generic function (Jukka Lehtosalo, PR [16342](https://github.com/python/mypy/pull/16342)) + * Fix daemon false positives related to module-level `__getattr__` (Jukka Lehtosalo, PR [16292](https://github.com/python/mypy/pull/16292)) + * Fix daemon crash related to ABCs (Jukka Lehtosalo, PR [16275](https://github.com/python/mypy/pull/16275)) + * Stream dmypy output instead of dumping everything at the end (Valentin Stanciu, PR [16252](https://github.com/python/mypy/pull/16252)) + * Make sure all dmypy errors are shown (Valentin Stanciu, PR [16250](https://github.com/python/mypy/pull/16250)) + +#### Mypyc Improvements + + * Generate error on duplicate function definitions (Jukka Lehtosalo, PR [16309](https://github.com/python/mypy/pull/16309)) + * Don't crash on unreachable statements (Jukka Lehtosalo, PR [16311](https://github.com/python/mypy/pull/16311)) + * Avoid cyclic reference in nested functions (Jukka Lehtosalo, PR [16268](https://github.com/python/mypy/pull/16268)) + * Fix direct `__dict__` access on inner functions in new Python (Shantanu, PR [16084](https://github.com/python/mypy/pull/16084)) + * Make tuple packing and unpacking more efficient (Jukka Lehtosalo, PR [16022](https://github.com/python/mypy/pull/16022)) + +#### Improvements to Error Reporting + + * Update starred expression error message to match CPython (Cibin Mathew, PR [16304](https://github.com/python/mypy/pull/16304)) + * Fix error code of "Maybe you forgot to use await" note (Jelle Zijlstra, PR [16203](https://github.com/python/mypy/pull/16203)) + * Use error code `[unsafe-overload]` for unsafe overloads, instead of `[misc]` (Randolf Scholz, PR [16061](https://github.com/python/mypy/pull/16061)) + * Reword the error message related to void functions (Albert Tugushev, PR [15876](https://github.com/python/mypy/pull/15876)) + * 
Represent bottom type as Never in messages (Shantanu, PR [15996](https://github.com/python/mypy/pull/15996)) + * Add hint for AsyncIterator incompatible return type (Ilya Priven, PR [15883](https://github.com/python/mypy/pull/15883)) + * Don't suggest stubs packages where the runtime package now ships with types (Alex Waygood, PR [16226](https://github.com/python/mypy/pull/16226)) + +#### Performance Improvements + + * Speed up type argument checking (Jukka Lehtosalo, PR [16353](https://github.com/python/mypy/pull/16353)) + * Add fast path for checking self types (Jukka Lehtosalo, PR [16352](https://github.com/python/mypy/pull/16352)) + * Cache information about whether file is typeshed file (Jukka Lehtosalo, PR [16351](https://github.com/python/mypy/pull/16351)) + * Skip expensive `repr()` in logging call when not needed (Jukka Lehtosalo, PR [16350](https://github.com/python/mypy/pull/16350)) + +#### Attrs and Dataclass Improvements + + * `dataclass.replace`: Allow transformed classes (Ilya Priven, PR [15915](https://github.com/python/mypy/pull/15915)) + * `dataclass.replace`: Fall through to typeshed signature (Ilya Priven, PR [15962](https://github.com/python/mypy/pull/15962)) + * Document `dataclass_transform` behavior (Ilya Priven, PR [16017](https://github.com/python/mypy/pull/16017)) + * `attrs`: Remove fields type check (Ilya Priven, PR [15983](https://github.com/python/mypy/pull/15983)) + * `attrs`, `dataclasses`: Don't enforce slots when base class doesn't (Ilya Priven, PR [15976](https://github.com/python/mypy/pull/15976)) + * Fix crash on dataclass field / property collision (Nikita Sobolev, PR [16147](https://github.com/python/mypy/pull/16147)) + +#### Stubgen Improvements + + * Write stubs with utf-8 encoding (Jørgen Lind, PR [16329](https://github.com/python/mypy/pull/16329)) + * Fix missing property setter in semantic analysis mode (Ali Hamdan, PR [16303](https://github.com/python/mypy/pull/16303)) + * Unify C extension and pure python stub generators with object oriented design (Chad Dombrova, PR [15770](https://github.com/python/mypy/pull/15770)) + * Multiple fixes to the generated imports (Ali Hamdan, PR [15624](https://github.com/python/mypy/pull/15624)) + * Generate valid dataclass stubs (Ali Hamdan, PR [15625](https://github.com/python/mypy/pull/15625)) + +#### Fixes to Crashes + + * Fix incremental mode crash on TypedDict in method (Ivan Levkivskyi, PR [16364](https://github.com/python/mypy/pull/16364)) + * Fix crash on star unpack in TypedDict (Ivan Levkivskyi, PR [16116](https://github.com/python/mypy/pull/16116)) + * Fix crash on malformed TypedDict in incremental mode (Ivan Levkivskyi, PR [16115](https://github.com/python/mypy/pull/16115)) + * Fix crash with report generation on namespace packages (Shantanu, PR [16019](https://github.com/python/mypy/pull/16019)) + * Fix crash when parsing error code config with typo (Shantanu, PR [16005](https://github.com/python/mypy/pull/16005)) + * Fix `__post_init__()` internal error (Ilya Priven, PR [16080](https://github.com/python/mypy/pull/16080)) + +#### Documentation Updates + + * Make it easier to copy commands from README (Hamir Mahal, PR [16133](https://github.com/python/mypy/pull/16133)) + * Document and rename `[overload-overlap]` error code (Shantanu, PR [16074](https://github.com/python/mypy/pull/16074)) + * Document `--force-uppercase-builtins` and `--force-union-syntax` (Nikita Sobolev, PR [16049](https://github.com/python/mypy/pull/16049)) + * Document `force_union_syntax` and `force_uppercase_builtins` 
(Nikita Sobolev, PR [16048](https://github.com/python/mypy/pull/16048)) + * Document we're not tracking relationships between symbols (Ilya Priven, PR [16018](https://github.com/python/mypy/pull/16018)) #### Other Notable Changes and Fixes -... + + * Propagate narrowed types to lambda expressions (Ivan Levkivskyi, PR [16407](https://github.com/python/mypy/pull/16407)) + * Avoid importing from `setuptools._distutils` (Shantanu, PR [16348](https://github.com/python/mypy/pull/16348)) + * Delete recursive aliases flags (Ivan Levkivskyi, PR [16346](https://github.com/python/mypy/pull/16346)) + * Properly use proper subtyping for callables (Ivan Levkivskyi, PR [16343](https://github.com/python/mypy/pull/16343)) + * Use upper bound as inference fallback more consistently (Ivan Levkivskyi, PR [16344](https://github.com/python/mypy/pull/16344)) + * Add `[unimported-reveal]` error code (Nikita Sobolev, PR [16271](https://github.com/python/mypy/pull/16271)) + * Add `|=` and `|` operators support for `TypedDict` (Nikita Sobolev, PR [16249](https://github.com/python/mypy/pull/16249)) + * Clarify variance convention for Parameters (Ivan Levkivskyi, PR [16302](https://github.com/python/mypy/pull/16302)) + * Correctly recognize `typing_extensions.NewType` (Ganden Schaffner, PR [16298](https://github.com/python/mypy/pull/16298)) + * Fix partially defined in the case of missing type maps (Shantanu, PR [15995](https://github.com/python/mypy/pull/15995)) + * Use SPDX license identifier (Nikita Sobolev, PR [16230](https://github.com/python/mypy/pull/16230)) + * Make `__qualname__` and `__module__` available in class bodies (Anthony Sottile, PR [16215](https://github.com/python/mypy/pull/16215)) + * stubtest: Hint when args in stub need to be keyword-only (Alex Waygood, PR [16210](https://github.com/python/mypy/pull/16210)) + * Tuple slice should not propagate fallback (Thomas Grainger, PR [16154](https://github.com/python/mypy/pull/16154)) + * Fix cases of type object handling for overloads (Shantanu, PR [16168](https://github.com/python/mypy/pull/16168)) + * Fix walrus interaction with empty collections (Ivan Levkivskyi, PR [16197](https://github.com/python/mypy/pull/16197)) + * Use type variable bound when it appears as actual during inference (Ivan Levkivskyi, PR [16178](https://github.com/python/mypy/pull/16178)) + * Use upper bounds as fallback solutions for inference (Ivan Levkivskyi, PR [16184](https://github.com/python/mypy/pull/16184)) + * Special-case type inference of empty collections (Ivan Levkivskyi, PR [16122](https://github.com/python/mypy/pull/16122)) + * Allow TypedDict unpacking in Callable types (Ivan Levkivskyi, PR [16083](https://github.com/python/mypy/pull/16083)) + * Fix inference for overloaded `__call__` with generic self (Shantanu, PR [16053](https://github.com/python/mypy/pull/16053)) + * Call dynamic class hook on generic classes (Petter Friberg, PR [16052](https://github.com/python/mypy/pull/16052)) + * Preserve implicitly exported types via attribute access (Shantanu, PR [16129](https://github.com/python/mypy/pull/16129)) + * Fix a stubtest bug (Alex Waygood) + * Fix `tuple[Any, ...]` subtyping (Shantanu, PR [16108](https://github.com/python/mypy/pull/16108)) + * Lenient handling of trivial Callable suffixes (Ivan Levkivskyi, PR [15913](https://github.com/python/mypy/pull/15913)) + * Add `add_overloaded_method_to_class` helper for plugins (Nikita Sobolev, PR [16038](https://github.com/python/mypy/pull/16038)) + * Bundle `misc/proper_plugin.py` as a part of `mypy` (Nikita Sobolev, 
PR [16036](https://github.com/python/mypy/pull/16036)) + * Fix `case Any()` in match statement (DS/Charlie, PR [14479](https://github.com/python/mypy/pull/14479)) + * Make iterable logic more consistent (Shantanu, PR [16006](https://github.com/python/mypy/pull/16006)) + * Fix inference for properties with `__call__` (Shantanu, PR [15926](https://github.com/python/mypy/pull/15926)) + +#### Typeshed Updates + +Please see [git log](https://github.com/python/typeshed/commits/main?after=4a854366e03dee700109f8e758a08b2457ea2f51+0&branch=main&path=stdlib) for full list of standard library typeshed stub changes. #### Acknowledgements -... + +Thanks to all mypy contributors who contributed to this release: + +* Albert Tugushev +* Alex Waygood +* Ali Hamdan +* Anders Kaseorg +* Anthony Sottile +* Chad Dombrova +* Cibin Mathew +* dinaldoap +* DS/Charlie +* Eli Schwartz +* Ganden Schaffner +* Hamir Mahal +* Ihor +* Ikko Eltociear Ashimine +* Ilya Priven +* Ivan Levkivskyi +* Jelle Zijlstra +* Jukka Lehtosalo +* Jørgen Lind +* KotlinIsland +* Matt Bogosian +* Nikita Sobolev +* Petter Friberg +* Randolf Scholz +* Shantanu +* Thomas Grainger +* Valentin Stanciu + +I’d also like to thank my employer, Dropbox, for supporting mypy development. + +Posted by Jukka Lehtosalo ## Mypy 1.6 diff --git a/docs/requirements-docs.txt b/docs/requirements-docs.txt index 395964ad9d44..a3504b07824d 100644 --- a/docs/requirements-docs.txt +++ b/docs/requirements-docs.txt @@ -1,2 +1,2 @@ -sphinx>=4.2.0,<5.0.0 +sphinx>=5.1.0 furo>=2022.3.4 diff --git a/docs/source/class_basics.rst b/docs/source/class_basics.rst index 73f95f1c5658..1d80da5830ec 100644 --- a/docs/source/class_basics.rst +++ b/docs/source/class_basics.rst @@ -263,7 +263,7 @@ effect at runtime: Abstract base classes and multiple inheritance ********************************************** -Mypy supports Python :doc:`abstract base classes ` (ABCs). Abstract classes +Mypy supports Python :doc:`abstract base classes ` (ABCs). Abstract classes have at least one abstract method or property that must be implemented by any *concrete* (non-abstract) subclass. You can define abstract base classes using the :py:class:`abc.ABCMeta` metaclass and the :py:func:`@abc.abstractmethod ` @@ -371,8 +371,7 @@ property or an instance variable. Slots ***** -When a class has explicitly defined -`__slots__ `_, +When a class has explicitly defined :std:term:`__slots__`, mypy will check that all attributes assigned to are members of ``__slots__``: .. code-block:: python diff --git a/docs/source/command_line.rst b/docs/source/command_line.rst index a810c35cb77f..09836e2ffd20 100644 --- a/docs/source/command_line.rst +++ b/docs/source/command_line.rst @@ -787,7 +787,7 @@ in error messages. disable reporting most additional errors. The limit only applies if it seems likely that most of the remaining errors will not be useful or they may be overly noisy. If ``N`` is negative, there is - no limit. The default limit is 200. + no limit. The default limit is -1. .. option:: --force-uppercase-builtins diff --git a/docs/source/config_file.rst b/docs/source/config_file.rst index b5ce23ff11ec..de769200bf2b 100644 --- a/docs/source/config_file.rst +++ b/docs/source/config_file.rst @@ -238,10 +238,8 @@ section of the command line docs. Crafting a single regular expression that excludes multiple files while remaining human-readable can be a challenge. The above example demonstrates one approach. 
``(?x)`` enables the ``VERBOSE`` flag for the subsequent regular expression, which - `ignores most whitespace and supports comments`__. The above is equivalent to: - ``(^one\.py$|two\.pyi$|^three\.)``. - - .. __: https://docs.python.org/3/library/re.html#re.X + :py:data:`ignores most whitespace and supports comments `. + The above is equivalent to: ``(^one\.py$|two\.pyi$|^three\.)``. For more details, see :option:`--exclude `. diff --git a/docs/source/error_code_list2.rst b/docs/source/error_code_list2.rst index cc5c9b0a1bc6..60f870c57db9 100644 --- a/docs/source/error_code_list2.rst +++ b/docs/source/error_code_list2.rst @@ -482,6 +482,37 @@ Example: def g(self, y: int) -> None: pass +.. _code-mutable-override: + +Check that overrides of mutable attributes are safe +--------------------------------------------------- + +This will enable the check for unsafe overrides of mutable attributes. For +historical reasons, and because this is a relatively common pattern in Python, +this check is not enabled by default. The example below is unsafe, and will be +flagged when this error code is enabled: + +.. code-block:: python + + from typing import Any + + class C: + x: float + y: float + z: float + + class D(C): + x: int # Error: Covariant override of a mutable attribute + # (base class "C" defined the type as "float", + # expression has type "int") [mutable-override] + y: float # OK + z: Any # OK + + def f(c: C) -> None: + c.x = 1.1 + d = D() + f(d) + d.x >> 1 # This will crash at runtime, because d.x is now float, not an int .. _code-unimported-reveal: @@ -493,8 +524,7 @@ that only existed during type-checking. In runtime it fails with expected ``NameError``, which can cause real problem in production, hidden from mypy. -But, in Python3.11 ``reveal_type`` -`was added to typing.py `_. +But, in Python3.11 :py:func:`typing.reveal_type` was added. ``typing_extensions`` ported this helper to all supported Python versions. Now users can actually import ``reveal_type`` to make the runtime code safe. diff --git a/docs/source/getting_started.rst b/docs/source/getting_started.rst index 463c73b2fe76..7ea4ddd148ea 100644 --- a/docs/source/getting_started.rst +++ b/docs/source/getting_started.rst @@ -256,8 +256,7 @@ Mypy can also understand how to work with types from libraries that you use. For instance, mypy comes out of the box with an intimate knowledge of the Python standard library. For example, here is a function which uses the -``Path`` object from the -`pathlib standard library module `_: +``Path`` object from the :doc:`pathlib standard library module `: .. 
code-block:: python diff --git a/docs/source/html_builder.py b/docs/source/html_builder.py index 3064833b5631..ea3594e0617b 100644 --- a/docs/source/html_builder.py +++ b/docs/source/html_builder.py @@ -9,11 +9,12 @@ from sphinx.addnodes import document from sphinx.application import Sphinx from sphinx.builders.html import StandaloneHTMLBuilder +from sphinx.environment import BuildEnvironment class MypyHTMLBuilder(StandaloneHTMLBuilder): - def __init__(self, app: Sphinx) -> None: - super().__init__(app) + def __init__(self, app: Sphinx, env: BuildEnvironment) -> None: + super().__init__(app, env) self._ref_to_doc = {} def write_doc(self, docname: str, doctree: document) -> None: diff --git a/docs/source/more_types.rst b/docs/source/more_types.rst index b27764a9e87c..cb3ef64b39a7 100644 --- a/docs/source/more_types.rst +++ b/docs/source/more_types.rst @@ -829,7 +829,7 @@ Typing async/await Mypy lets you type coroutines that use the ``async/await`` syntax. For more information regarding coroutines, see :pep:`492` and the -`asyncio documentation `_. +`asyncio documentation `_. Functions defined using ``async def`` are typed similar to normal functions. The return type annotation should be the same as the type of the value you diff --git a/misc/dump-ast.py b/misc/dump-ast.py index 6f70bbc8c9ed..7fdf905bae0b 100755 --- a/misc/dump-ast.py +++ b/misc/dump-ast.py @@ -9,7 +9,7 @@ import sys from mypy import defaults -from mypy.errors import CompileError +from mypy.errors import CompileError, Errors from mypy.options import Options from mypy.parse import parse @@ -19,7 +19,7 @@ def dump(fname: str, python_version: tuple[int, int], quiet: bool = False) -> No options.python_version = python_version with open(fname, "rb") as f: s = f.read() - tree = parse(s, fname, None, errors=None, options=options) + tree = parse(s, fname, None, errors=Errors(options), options=options) if not quiet: print(tree) diff --git a/misc/gen_blog_post_html.py b/misc/gen_blog_post_html.py new file mode 100644 index 000000000000..7170696d5d09 --- /dev/null +++ b/misc/gen_blog_post_html.py @@ -0,0 +1,171 @@ +"""Converter from CHANGELOG.md (Markdown) to HTML suitable for a mypy blog post. + +How to use: + +1. Write release notes in CHANGELOG.md. +2. Make sure the heading for the next release is of form `## Mypy X.Y`. +2. Run `misc/gen_blog_post_html.py X.Y > target.html`. +4. Manually inspect and tweak the result. + +Notes: + +* There are some fragile assumptions. Double check the output. +""" + +import argparse +import html +import os +import re +import sys + + +def format_lists(h: str) -> str: + a = h.splitlines() + r = [] + i = 0 + bullets = ("- ", "* ", " * ") + while i < len(a): + if a[i].startswith(bullets): + r.append("

    ") + while i < len(a) and a[i].startswith(bullets): + r.append("
  • %s" % a[i][2:].lstrip()) + i += 1 + r.append("
") + else: + r.append(a[i]) + i += 1 + return "\n".join(r) + + +def format_code(h: str) -> str: + a = h.splitlines() + r = [] + i = 0 + while i < len(a): + if a[i].startswith(" ") or a[i].startswith("```"): + indent = a[i].startswith(" ") + if not indent: + i += 1 + r.append("
")
+            while i < len(a) and (
+                (indent and a[i].startswith("    ")) or (not indent and not a[i].startswith("```"))
+            ):
+                # Undo &gt; and &lt;
+                line = a[i].replace("&gt;", ">").replace("&lt;", "<")
+                if not indent:
+                    line = "    " + line
+                r.append(html.escape(line))
+                i += 1
+            r.append("</pre>")
+            if not indent and a[i].startswith("```"):
+                i += 1
+        else:
+            r.append(a[i])
+            i += 1
+    return "\n".join(r)
+
+
+def convert(src: str) -> str:
+    h = src
+
+    # Replace < and >.
+    h = re.sub(r"<", "&lt;", h)
+    h = re.sub(r">", "&gt;", h)
+
+    # Title
+    h = re.sub(r"^## (Mypy [0-9.]+)", r"<h1>\1 Released</h1>", h, flags=re.MULTILINE)
+
+    # Subheadings
+    h = re.sub(r"\n#### ([A-Z`].*)\n", r"\n<h2>\1</h2>\n", h)
+
+    # Sub-subheadings
+    h = re.sub(r"\n\*\*([A-Z_`].*)\*\*\n", r"\n<h3>\1</h3>\n", h)
+    h = re.sub(r"\n`\*\*([A-Z_`].*)\*\*\n", r"\n<h3>`\1</h3>\n", h)
+
+    # Translate `**`
+    h = re.sub(r"`\*\*`", "**", h)
+
+    # Paragraphs
+    h = re.sub(r"\n([A-Z])", r"\n<p>
\1", h) + + # Bullet lists + h = format_lists(h) + + # Code blocks + h = format_code(h) + + # Code fragments + h = re.sub(r"`([^`]+)`", r"\1", h) + + # Remove **** noise + h = re.sub(r"\*\*\*\*", "", h) + + # Bold text + h = re.sub(r"\*\*([A-Za-z].*?)\*\*", r" \1", h) + + # Emphasized text + h = re.sub(r" \*([A-Za-z].*?)\*", r" \1", h) + + # Remove redundant PR links to avoid double links (they will be generated below) + h = re.sub(r"\[(#[0-9]+)\]\(https://github.com/python/mypy/pull/[0-9]+/?\)", r"\1", h) + + # Issue and PR links + h = re.sub(r"\((#[0-9]+)\) +\(([^)]+)\)", r"(\2, \1)", h) + h = re.sub( + r"fixes #([0-9]+)", + r'fixes issue \1', + h, + ) + h = re.sub(r"#([0-9]+)", r'PR \1', h) + h = re.sub(r"\) \(PR", ", PR", h) + + # Markdown links + h = re.sub(r"\[([^]]*)\]\(([^)]*)\)", r'\1', h) + + # Add random links in case they are missing + h = re.sub( + r"contributors to typeshed:", + 'contributors to typeshed:', + h, + ) + + # Add missing top-level HTML tags + h = '\n\n\n' + h + "\n" + + return h + + +def extract_version(src: str, version: str) -> str: + a = src.splitlines() + i = 0 + heading = f"## Mypy {version}" + while i < len(a): + if a[i].strip() == heading: + break + i += 1 + else: + raise RuntimeError(f"Can't find heading {heading!r}") + j = i + 1 + while not a[j].startswith("## "): + j += 1 + return "\n".join(a[i:j]) + + +def main() -> None: + parser = argparse.ArgumentParser( + description="Generate HTML release blog post based on CHANGELOG.md and write to stdout." + ) + parser.add_argument("version", help="mypy version, in form X.Y or X.Y.Z") + args = parser.parse_args() + version: str = args.version + if not re.match(r"[0-9]+(\.[0-9]+)+$", version): + sys.exit(f"error: Version must be of form X.Y or X.Y.Z, not {version!r}") + changelog_path = os.path.join(os.path.dirname(__file__), os.path.pardir, "CHANGELOG.md") + src = open(changelog_path).read() + src = extract_version(src, version) + dst = convert(src) + sys.stdout.write(dst) + + +if __name__ == "__main__": + main() diff --git a/misc/generate_changelog.py b/misc/generate_changelog.py new file mode 100644 index 000000000000..7c7f28b6eeb7 --- /dev/null +++ b/misc/generate_changelog.py @@ -0,0 +1,201 @@ +"""Generate the changelog for a mypy release.""" + +from __future__ import annotations + +import argparse +import re +import subprocess +import sys +from dataclasses import dataclass + + +def find_all_release_branches() -> list[tuple[int, int]]: + result = subprocess.run(["git", "branch", "-r"], text=True, capture_output=True, check=True) + versions = [] + for line in result.stdout.splitlines(): + line = line.strip() + if m := re.match(r"origin/release-([0-9]+)\.([0-9]+)$", line): + major = int(m.group(1)) + minor = int(m.group(2)) + versions.append((major, minor)) + return versions + + +def git_merge_base(rev1: str, rev2: str) -> str: + result = subprocess.run( + ["git", "merge-base", rev1, rev2], text=True, capture_output=True, check=True + ) + return result.stdout.strip() + + +@dataclass +class CommitInfo: + commit: str + author: str + title: str + pr_number: int | None + + +def normalize_author(author: str) -> str: + # Some ad-hoc rules to get more consistent author names. 
+ if author == "AlexWaygood": + return "Alex Waygood" + elif author == "jhance": + return "Jared Hance" + return author + + +def git_commit_log(rev1: str, rev2: str) -> list[CommitInfo]: + result = subprocess.run( + ["git", "log", "--pretty=%H\t%an\t%s", f"{rev1}..{rev2}"], + text=True, + capture_output=True, + check=True, + ) + commits = [] + for line in result.stdout.splitlines(): + commit, author, title = line.strip().split("\t", 2) + pr_number = None + if m := re.match(r".*\(#([0-9]+)\) *$", title): + pr_number = int(m.group(1)) + title = re.sub(r" *\(#[0-9]+\) *$", "", title) + + author = normalize_author(author) + entry = CommitInfo(commit, author, title, pr_number) + commits.append(entry) + return commits + + +def filter_omitted_commits(commits: list[CommitInfo]) -> list[CommitInfo]: + result = [] + for c in commits: + title = c.title + keep = True + if title.startswith("Sync typeshed"): + # Typeshed syncs aren't mentioned in release notes + keep = False + if title.startswith( + ( + "Revert sum literal integer change", + "Remove use of LiteralString in builtins", + "Revert typeshed ctypes change", + "Revert use of `ParamSpec` for `functools.wraps`", + ) + ): + # These are generated by a typeshed sync. + keep = False + if re.search(r"(bump|update).*version.*\+dev", title.lower()): + # Version number updates aren't mentioned + keep = False + if "pre-commit autoupdate" in title: + keep = False + if title.startswith(("Update commit hashes", "Update hashes")): + # Internal tool change + keep = False + if keep: + result.append(c) + return result + + +def normalize_title(title: str) -> str: + # We sometimes add a title prefix when cherry-picking commits to a + # release branch. Attempt to remove these prefixes so that we can + # match them to the corresponding master branch. + if m := re.match(r"\[release [0-9.]+\] *", title, flags=re.I): + title = title.replace(m.group(0), "") + return title + + +def filter_out_commits_from_old_release_branch( + new_commits: list[CommitInfo], old_commits: list[CommitInfo] +) -> list[CommitInfo]: + old_titles = {normalize_title(commit.title) for commit in old_commits} + result = [] + for commit in new_commits: + drop = False + if normalize_title(commit.title) in old_titles: + drop = True + if normalize_title(f"{commit.title} (#{commit.pr_number})") in old_titles: + drop = True + if not drop: + result.append(commit) + else: + print(f'NOTE: Drop "{commit.title}", since it was in previous release branch') + return result + + +def find_changes_between_releases(old_branch: str, new_branch: str) -> list[CommitInfo]: + merge_base = git_merge_base(old_branch, new_branch) + print(f"Merge base: {merge_base}") + new_commits = git_commit_log(merge_base, new_branch) + old_commits = git_commit_log(merge_base, old_branch) + + # Filter out some commits that won't be mentioned in release notes. + new_commits = filter_omitted_commits(new_commits) + + # Filter out commits cherry-picked to old branch. 
+ new_commits = filter_out_commits_from_old_release_branch(new_commits, old_commits) + + return new_commits + + +def format_changelog_entry(c: CommitInfo) -> str: + """ + s = f" * {c.commit[:9]} - {c.title}" + if c.pr_number: + s += f" (#{c.pr_number})" + s += f" ({c.author})" + """ + s = f" * {c.title} ({c.author}" + if c.pr_number: + s += f", PR [{c.pr_number}](https://github.com/python/mypy/pull/{c.pr_number})" + s += ")" + + return s + + +def main() -> None: + parser = argparse.ArgumentParser() + parser.add_argument("version", help="target mypy version (form X.Y)") + parser.add_argument("--local", action="store_true") + args = parser.parse_args() + version: str = args.version + local: bool = args.local + + if not re.match(r"[0-9]+\.[0-9]+$", version): + sys.exit(f"error: Release must be of form X.Y (not {version!r})") + major, minor = (int(component) for component in version.split(".")) + + if not local: + print("Running 'git fetch' to fetch all release branches...") + subprocess.run(["git", "fetch"], check=True) + + if minor > 0: + prev_major = major + prev_minor = minor - 1 + else: + # For a x.0 release, the previous release is the most recent (x-1).y release. + all_releases = sorted(find_all_release_branches()) + if (major, minor) not in all_releases: + sys.exit(f"error: Can't find release branch for {major}.{minor} at origin") + for i in reversed(range(len(all_releases))): + if all_releases[i][0] == major - 1: + prev_major, prev_minor = all_releases[i] + break + else: + sys.exit("error: Could not determine previous release") + print(f"Generating changelog for {major}.{minor}") + print(f"Previous release was {prev_major}.{prev_minor}") + + new_branch = f"origin/release-{major}.{minor}" + old_branch = f"origin/release-{prev_major}.{prev_minor}" + + changes = find_changes_between_releases(old_branch, new_branch) + + print() + for c in changes: + print(format_changelog_entry(c)) + + +if __name__ == "__main__": + main() diff --git a/misc/sync-typeshed.py b/misc/sync-typeshed.py index 77f921a89b1b..9d6fd92270a5 100644 --- a/misc/sync-typeshed.py +++ b/misc/sync-typeshed.py @@ -179,10 +179,10 @@ def main() -> None: print("Created typeshed sync commit.") commits_to_cherry_pick = [ - "9859fe7ba", # LiteralString reverts - "378a866e9", # sum reverts - "2816b97d5", # ctypes reverts - "7d987a105", # ParamSpec for functools.wraps + "588623ff2", # LiteralString reverts + "bdcc90e85", # sum reverts + "3e5d81337", # ctypes reverts + "344298e3a", # ParamSpec for functools.wraps ] for commit in commits_to_cherry_pick: try: diff --git a/misc/test-stubgenc.sh b/misc/test-stubgenc.sh index 7713e1b04e43..5cb5140eba76 100755 --- a/misc/test-stubgenc.sh +++ b/misc/test-stubgenc.sh @@ -24,7 +24,7 @@ function stubgenc_test() { # Compare generated stubs to expected ones if ! git diff --exit-code "$STUBGEN_OUTPUT_FOLDER"; then - EXIT=$? + EXIT=1 fi } diff --git a/mypy/build.py b/mypy/build.py index 605368a6dc51..b3ca8d06916d 100644 --- a/mypy/build.py +++ b/mypy/build.py @@ -145,7 +145,7 @@ def build( sources: list[BuildSource], options: Options, alt_lib_path: str | None = None, - flush_errors: Callable[[list[str], bool], None] | None = None, + flush_errors: Callable[[str | None, list[str], bool], None] | None = None, fscache: FileSystemCache | None = None, stdout: TextIO | None = None, stderr: TextIO | None = None, @@ -177,7 +177,9 @@ def build( # fields for callers that want the traditional API. 
messages = [] - def default_flush_errors(new_messages: list[str], is_serious: bool) -> None: + def default_flush_errors( + filename: str | None, new_messages: list[str], is_serious: bool + ) -> None: messages.extend(new_messages) flush_errors = flush_errors or default_flush_errors @@ -197,7 +199,7 @@ def default_flush_errors(new_messages: list[str], is_serious: bool) -> None: # Patch it up to contain either none or all none of the messages, # depending on whether we are flushing errors. serious = not e.use_stdout - flush_errors(e.messages, serious) + flush_errors(None, e.messages, serious) e.messages = messages raise @@ -206,7 +208,7 @@ def _build( sources: list[BuildSource], options: Options, alt_lib_path: str | None, - flush_errors: Callable[[list[str], bool], None], + flush_errors: Callable[[str | None, list[str], bool], None], fscache: FileSystemCache | None, stdout: TextIO, stderr: TextIO, @@ -600,7 +602,7 @@ def __init__( plugin: Plugin, plugins_snapshot: dict[str, str], errors: Errors, - flush_errors: Callable[[list[str], bool], None], + flush_errors: Callable[[str | None, list[str], bool], None], fscache: FileSystemCache, stdout: TextIO, stderr: TextIO, @@ -2172,8 +2174,8 @@ def parse_file(self, *, temporary: bool = False) -> None: self.id, self.xpath, source, - self.ignore_all or self.options.ignore_errors, - self.options, + ignore_errors=self.ignore_all or self.options.ignore_errors, + options=self.options, ) else: @@ -3458,7 +3460,11 @@ def process_stale_scc(graph: Graph, scc: list[str], manager: BuildManager) -> No for id in stale: graph[id].transitive_error = True for id in stale: - manager.flush_errors(manager.errors.file_messages(graph[id].xpath), False) + manager.flush_errors( + manager.errors.simplify_path(graph[id].xpath), + manager.errors.file_messages(graph[id].xpath), + False, + ) graph[id].write_cache() graph[id].mark_as_rechecked() diff --git a/mypy/checker.py b/mypy/checker.py index f51ba746ea75..979a55b223c9 100644 --- a/mypy/checker.py +++ b/mypy/checker.py @@ -1879,6 +1879,7 @@ def check_explicit_override_decorator( found_method_base_classes and not defn.is_explicit_override and defn.name not in ("__init__", "__new__") + and not is_private(defn.name) ): self.msg.explicit_override_decorator_missing( defn.name, found_method_base_classes[0].fullname, context or defn @@ -1921,7 +1922,7 @@ def check_method_or_accessor_override_for_base( base_attr = base.names.get(name) if base_attr: # First, check if we override a final (always an error, even with Any types). - if is_final_node(base_attr.node): + if is_final_node(base_attr.node) and not is_private(name): self.msg.cant_override_final(name, base.name, defn) # Second, final can't override anything writeable independently of types. if defn.is_final: @@ -2041,7 +2042,6 @@ def check_method_override_for_base_with_name( pass elif isinstance(original_type, FunctionLike) and isinstance(typ, FunctionLike): # Check that the types are compatible. - # TODO overloaded signatures self.check_override( typ, original_type, @@ -2056,7 +2056,6 @@ def check_method_override_for_base_with_name( # Assume invariance for a non-callable attribute here. Note # that this doesn't affect read-only properties which can have # covariant overrides. - # pass elif ( original_node @@ -2636,6 +2635,9 @@ class C(B, A[int]): ... # this is unsafe because... 
first_type = get_proper_type(self.determine_type_of_member(first)) second_type = get_proper_type(self.determine_type_of_member(second)) + # TODO: use more principled logic to decide is_subtype() vs is_equivalent(). + # We should rely on mutability of superclass node, not on types being Callable. + # start with the special case that Instance can be a subtype of FunctionLike call = None if isinstance(first_type, Instance): @@ -2679,7 +2681,7 @@ class C(B, A[int]): ... # this is unsafe because... ok = True # Final attributes can never be overridden, but can override # non-final read-only attributes. - if is_final_node(second.node): + if is_final_node(second.node) and not is_private(name): self.msg.cant_override_final(name, base2.name, ctx) if is_final_node(first.node): self.check_if_final_var_override_writable(name, second.node, ctx) @@ -3211,7 +3213,7 @@ def check_compatibility_super( if base_static and compare_static: lvalue_node.is_staticmethod = True - return self.check_subtype( + ok = self.check_subtype( compare_type, base_type, rvalue, @@ -3219,6 +3221,20 @@ def check_compatibility_super( "expression has type", f'base class "{base.name}" defined the type as', ) + if ( + ok + and codes.MUTABLE_OVERRIDE in self.options.enabled_error_codes + and self.is_writable_attribute(base_node) + ): + ok = self.check_subtype( + base_type, + compare_type, + rvalue, + message_registry.COVARIANT_OVERRIDE_OF_MUTABLE_ATTRIBUTE, + f'base class "{base.name}" defined the type as', + "expression has type", + ) + return ok return True def lvalue_type_from_base( @@ -3293,6 +3309,8 @@ def check_compatibility_final_super( """ if not isinstance(base_node, (Var, FuncBase, Decorator)): return True + if is_private(node.name): + return True if base_node.is_final and (node.is_final or not isinstance(base_node, Var)): # Give this error only for explicit override attempt with `Final`, or # if we are overriding a final method with variable. @@ -5236,6 +5254,15 @@ def _make_fake_typeinfo_and_full_name( pretty_names_list = pretty_seq( format_type_distinctly(*base_classes, options=self.options, bare=True), "and" ) + + new_errors = [] + for base in base_classes: + if base.type.is_final: + new_errors.append((pretty_names_list, f'"{base.type.name}" is final')) + if new_errors: + errors.extend(new_errors) + return None + try: info, full_name = _make_fake_typeinfo_and_full_name(base_classes, curr_module) with self.msg.filter_errors() as local_errors: @@ -5248,10 +5275,10 @@ def _make_fake_typeinfo_and_full_name( self.check_multiple_inheritance(info) info.is_intersection = True except MroError: - errors.append((pretty_names_list, "inconsistent method resolution order")) + errors.append((pretty_names_list, "would have inconsistent method resolution order")) return None if local_errors.has_new_errors(): - errors.append((pretty_names_list, "incompatible method signatures")) + errors.append((pretty_names_list, "would have incompatible method signatures")) return None curr_module.names[full_name] = SymbolTableNode(GDEF, info) @@ -5652,22 +5679,29 @@ def find_isinstance_check_helper(self, node: Expression) -> tuple[TypeMap, TypeM if node.arg_kinds[0] != nodes.ARG_POS: # the first argument might be used as a kwarg called_type = get_proper_type(self.lookup_type(node.callee)) - assert isinstance(called_type, (CallableType, Overloaded)) + + # TODO: there are some more cases in check_call() to handle. 
+ if isinstance(called_type, Instance): + call = find_member( + "__call__", called_type, called_type, is_operator=True + ) + if call is not None: + called_type = get_proper_type(call) # *assuming* the overloaded function is correct, there's a couple cases: # 1) The first argument has different names, but is pos-only. We don't # care about this case, the argument must be passed positionally. # 2) The first argument allows keyword reference, therefore must be the # same between overloads. - name = called_type.items[0].arg_names[0] - - if name in node.arg_names: - idx = node.arg_names.index(name) - # we want the idx-th variable to be narrowed - expr = collapse_walrus(node.args[idx]) - else: - self.fail(message_registry.TYPE_GUARD_POS_ARG_REQUIRED, node) - return {}, {} + if isinstance(called_type, (CallableType, Overloaded)): + name = called_type.items[0].arg_names[0] + if name in node.arg_names: + idx = node.arg_names.index(name) + # we want the idx-th variable to be narrowed + expr = collapse_walrus(node.args[idx]) + else: + self.fail(message_registry.TYPE_GUARD_POS_ARG_REQUIRED, node) + return {}, {} if literal(expr) == LITERAL_TYPE: # Note: we wrap the target type, so that we can special case later. # Namely, for isinstance() we use a normal meet, while TypeGuard is diff --git a/mypy/checkexpr.py b/mypy/checkexpr.py index 0207c245b1f9..626584bc3a20 100644 --- a/mypy/checkexpr.py +++ b/mypy/checkexpr.py @@ -2440,34 +2440,28 @@ def check_argument_types( # the suffices to the tuple, e.g. a single actual like # Tuple[Unpack[Ts], int] expanded_tuple = False + actual_kinds = [arg_kinds[a] for a in actuals] if len(actuals) > 1: - first_actual_arg_type = get_proper_type(arg_types[actuals[0]]) + p_actual_type = get_proper_type(arg_types[actuals[0]]) if ( - isinstance(first_actual_arg_type, TupleType) - and len(first_actual_arg_type.items) == 1 - and isinstance(first_actual_arg_type.items[0], UnpackType) + isinstance(p_actual_type, TupleType) + and len(p_actual_type.items) == 1 + and isinstance(p_actual_type.items[0], UnpackType) + and actual_kinds == [nodes.ARG_STAR] + [nodes.ARG_POS] * (len(actuals) - 1) ): - # TODO: use walrus operator - actual_types = [first_actual_arg_type.items[0]] + [ - arg_types[a] for a in actuals[1:] - ] - actual_kinds = [nodes.ARG_STAR] + [nodes.ARG_POS] * (len(actuals) - 1) - - # If we got here, the callee was previously inferred to have a suffix. 
- assert isinstance(orig_callee_arg_type, UnpackType) - assert isinstance(orig_callee_arg_type.type, ProperType) and isinstance( - orig_callee_arg_type.type, TupleType - ) - assert orig_callee_arg_type.type.items - callee_arg_types = orig_callee_arg_type.type.items - callee_arg_kinds = [nodes.ARG_STAR] + [nodes.ARG_POS] * ( - len(orig_callee_arg_type.type.items) - 1 - ) - expanded_tuple = True + actual_types = [p_actual_type.items[0]] + [arg_types[a] for a in actuals[1:]] + if isinstance(orig_callee_arg_type, UnpackType): + p_callee_type = get_proper_type(orig_callee_arg_type.type) + if isinstance(p_callee_type, TupleType): + assert p_callee_type.items + callee_arg_types = p_callee_type.items + callee_arg_kinds = [nodes.ARG_STAR] + [nodes.ARG_POS] * ( + len(p_callee_type.items) - 1 + ) + expanded_tuple = True if not expanded_tuple: actual_types = [arg_types[a] for a in actuals] - actual_kinds = [arg_kinds[a] for a in actuals] if isinstance(orig_callee_arg_type, UnpackType): unpacked_type = get_proper_type(orig_callee_arg_type.type) if isinstance(unpacked_type, TupleType): @@ -3623,8 +3617,9 @@ def dangerous_comparison( self, left: Type, right: Type, - original_container: Type | None = None, *, + original_container: Type | None = None, + seen_types: set[tuple[Type, Type]] | None = None, prefer_literal: bool = True, ) -> bool: """Check for dangerous non-overlapping comparisons like 42 == 'no'. @@ -3645,6 +3640,12 @@ def dangerous_comparison( if not self.chk.options.strict_equality: return False + if seen_types is None: + seen_types = set() + if (left, right) in seen_types: + return False + seen_types.add((left, right)) + left, right = get_proper_types((left, right)) # We suppress the error if there is a custom __eq__() method on either @@ -3700,17 +3701,21 @@ def dangerous_comparison( abstract_set = self.chk.lookup_typeinfo("typing.AbstractSet") left = map_instance_to_supertype(left, abstract_set) right = map_instance_to_supertype(right, abstract_set) - return self.dangerous_comparison(left.args[0], right.args[0]) + return self.dangerous_comparison( + left.args[0], right.args[0], seen_types=seen_types + ) elif left.type.has_base("typing.Mapping") and right.type.has_base("typing.Mapping"): # Similar to above: Mapping ignores the classes, it just compares items. abstract_map = self.chk.lookup_typeinfo("typing.Mapping") left = map_instance_to_supertype(left, abstract_map) right = map_instance_to_supertype(right, abstract_map) return self.dangerous_comparison( - left.args[0], right.args[0] - ) or self.dangerous_comparison(left.args[1], right.args[1]) + left.args[0], right.args[0], seen_types=seen_types + ) or self.dangerous_comparison(left.args[1], right.args[1], seen_types=seen_types) elif left_name in ("builtins.list", "builtins.tuple") and right_name == left_name: - return self.dangerous_comparison(left.args[0], right.args[0]) + return self.dangerous_comparison( + left.args[0], right.args[0], seen_types=seen_types + ) elif left_name in OVERLAPPING_BYTES_ALLOWLIST and right_name in ( OVERLAPPING_BYTES_ALLOWLIST ): @@ -4908,7 +4913,7 @@ def tuple_context_matches(self, expr: TupleExpr, ctx: TupleType) -> bool: return len([e for e in expr.items if not isinstance(e, StarExpr)]) <= len(ctx.items) # For variadic context, the only easy case is when structure matches exactly. # TODO: try using tuple type context in more cases. 
- if len([e for e in expr.items if not isinstance(e, StarExpr)]) != 1: + if len([e for e in expr.items if isinstance(e, StarExpr)]) != 1: return False expr_star_index = next(i for i, lv in enumerate(expr.items) if isinstance(lv, StarExpr)) return len(expr.items) == len(ctx.items) and ctx_unpack_index == expr_star_index @@ -4947,6 +4952,9 @@ def visit_tuple_expr(self, e: TupleExpr) -> Type: if type_context_items is not None: unpack_in_context = find_unpack_in_list(type_context_items) is not None seen_unpack_in_items = False + allow_precise_tuples = ( + unpack_in_context or PRECISE_TUPLE_TYPES in self.chk.options.enable_incomplete_feature + ) # Infer item types. Give up if there's a star expression # that's not a Tuple. @@ -4987,10 +4995,7 @@ def visit_tuple_expr(self, e: TupleExpr) -> Type: # result in an error later, just do something predictable here. j += len(tt.items) else: - if ( - PRECISE_TUPLE_TYPES in self.chk.options.enable_incomplete_feature - and not seen_unpack_in_items - ): + if allow_precise_tuples and not seen_unpack_in_items: # Handle (x, *y, z), where y is e.g. tuple[Y, ...]. if isinstance(tt, Instance) and self.chk.type_is_iterable(tt): item_type = self.chk.iterable_item_type(tt, e) @@ -5201,7 +5206,8 @@ def visit_lambda_expr(self, e: LambdaExpr) -> Type: else: # Type context available. self.chk.return_types.append(inferred_type.ret_type) - self.chk.check_func_item(e, type_override=type_override) + with self.chk.tscope.function_scope(e): + self.chk.check_func_item(e, type_override=type_override) if not self.chk.has_type(e.expr()): # TODO: return expression must be accepted before exiting function scope. self.accept(e.expr(), allow_none_return=True) @@ -6203,11 +6209,16 @@ class PolyTranslator(TypeTranslator): See docstring for apply_poly() for details. """ - def __init__(self, poly_tvars: Sequence[TypeVarLikeType]) -> None: + def __init__( + self, + poly_tvars: Iterable[TypeVarLikeType], + bound_tvars: frozenset[TypeVarLikeType] = frozenset(), + seen_aliases: frozenset[TypeInfo] = frozenset(), + ) -> None: self.poly_tvars = set(poly_tvars) # This is a simplified version of TypeVarScope used during semantic analysis. 
- self.bound_tvars: set[TypeVarLikeType] = set() - self.seen_aliases: set[TypeInfo] = set() + self.bound_tvars = bound_tvars + self.seen_aliases = seen_aliases def collect_vars(self, t: CallableType | Parameters) -> list[TypeVarLikeType]: found_vars = [] @@ -6283,10 +6294,11 @@ def visit_instance(self, t: Instance) -> Type: if t.args and t.type.is_protocol and t.type.protocol_members == ["__call__"]: if t.type in self.seen_aliases: raise PolyTranslationError() - self.seen_aliases.add(t.type) call = find_member("__call__", t, t, is_operator=True) assert call is not None - return call.accept(self) + return call.accept( + PolyTranslator(self.poly_tvars, self.bound_tvars, self.seen_aliases | {t.type}) + ) return super().visit_instance(t) diff --git a/mypy/config_parser.py b/mypy/config_parser.py index 4dbd6477c81e..a6bf021000c1 100644 --- a/mypy/config_parser.py +++ b/mypy/config_parser.py @@ -152,6 +152,17 @@ def check_follow_imports(choice: str) -> str: return choice +def check_junit_format(choice: str) -> str: + choices = ["global", "per_file"] + if choice not in choices: + raise argparse.ArgumentTypeError( + "invalid choice '{}' (choose from {})".format( + choice, ", ".join(f"'{x}'" for x in choices) + ) + ) + return choice + + def split_commas(value: str) -> list[str]: # Uses a bit smarter technique to allow last trailing comma # and to remove last `""` item from the split. @@ -173,6 +184,7 @@ def split_commas(value: str) -> list[str]: "files": split_and_match_files, "quickstart_file": expand_path, "junit_xml": expand_path, + "junit_format": check_junit_format, "follow_imports": check_follow_imports, "no_site_packages": bool, "plugins": lambda s: [p.strip() for p in split_commas(s)], @@ -200,6 +212,7 @@ def split_commas(value: str) -> list[str]: "python_version": parse_version, "mypy_path": lambda s: [expand_path(p) for p in try_split(s, "[,:]")], "files": lambda s: split_and_match_files_list(try_split(s)), + "junit_format": lambda s: check_junit_format(str(s)), "follow_imports": lambda s: check_follow_imports(str(s)), "plugins": try_split, "always_true": try_split, diff --git a/mypy/constraints.py b/mypy/constraints.py index 49e542a49e56..d6a4b28799e5 100644 --- a/mypy/constraints.py +++ b/mypy/constraints.py @@ -226,25 +226,22 @@ def infer_constraints_for_callable( actual_type = mapper.expand_actual_type( actual_arg_type, arg_kinds[actual], callee.arg_names[i], callee.arg_kinds[i] ) - if ( - param_spec - and callee.arg_kinds[i] in (ARG_STAR, ARG_STAR2) - and not incomplete_star_mapping - ): + if param_spec and callee.arg_kinds[i] in (ARG_STAR, ARG_STAR2): # If actual arguments are mapped to ParamSpec type, we can't infer individual # constraints, instead store them and infer single constraint at the end. # It is impossible to map actual kind to formal kind, so use some heuristic. # This inference is used as a fallback, so relying on heuristic should be OK. 
- param_spec_arg_types.append( - mapper.expand_actual_type( - actual_arg_type, arg_kinds[actual], None, arg_kinds[actual] + if not incomplete_star_mapping: + param_spec_arg_types.append( + mapper.expand_actual_type( + actual_arg_type, arg_kinds[actual], None, arg_kinds[actual] + ) ) - ) - actual_kind = arg_kinds[actual] - param_spec_arg_kinds.append( - ARG_POS if actual_kind not in (ARG_STAR, ARG_STAR2) else actual_kind - ) - param_spec_arg_names.append(arg_names[actual] if arg_names else None) + actual_kind = arg_kinds[actual] + param_spec_arg_kinds.append( + ARG_POS if actual_kind not in (ARG_STAR, ARG_STAR2) else actual_kind + ) + param_spec_arg_names.append(arg_names[actual] if arg_names else None) else: c = infer_constraints(callee.arg_types[i], actual_type, SUPERTYPE_OF) constraints.extend(c) @@ -267,6 +264,9 @@ def infer_constraints_for_callable( ), ) ) + if any(isinstance(v, ParamSpecType) for v in callee.variables): + # As a perf optimization filter imprecise constraints only when we can have them. + constraints = filter_imprecise_kinds(constraints) return constraints @@ -949,7 +949,7 @@ def visit_instance(self, template: Instance) -> list[Constraint]: for item in actual.items: if isinstance(item, UnpackType): unpacked = get_proper_type(item.type) - if isinstance(unpacked, TypeVarType): + if isinstance(unpacked, TypeVarTupleType): # Cannot infer anything for T from [T, ...] <: *Ts continue assert ( @@ -1094,29 +1094,18 @@ def visit_callable_type(self, template: CallableType) -> list[Constraint]: ) param_spec_target: Type | None = None - skip_imprecise = ( - any(c.type_var == param_spec.id for c in res) and cactual.imprecise_arg_kinds - ) if not cactual_ps: max_prefix_len = len([k for k in cactual.arg_kinds if k in (ARG_POS, ARG_OPT)]) prefix_len = min(prefix_len, max_prefix_len) - # This logic matches top-level callable constraint exception, if we managed - # to get other constraints for ParamSpec, don't infer one with imprecise kinds - if not skip_imprecise: - param_spec_target = Parameters( - arg_types=cactual.arg_types[prefix_len:], - arg_kinds=cactual.arg_kinds[prefix_len:], - arg_names=cactual.arg_names[prefix_len:], - variables=cactual.variables - if not type_state.infer_polymorphic - else [], - imprecise_arg_kinds=cactual.imprecise_arg_kinds, - ) + param_spec_target = Parameters( + arg_types=cactual.arg_types[prefix_len:], + arg_kinds=cactual.arg_kinds[prefix_len:], + arg_names=cactual.arg_names[prefix_len:], + variables=cactual.variables if not type_state.infer_polymorphic else [], + imprecise_arg_kinds=cactual.imprecise_arg_kinds, + ) else: - if ( - len(param_spec.prefix.arg_types) <= len(cactual_ps.prefix.arg_types) - and not skip_imprecise - ): + if len(param_spec.prefix.arg_types) <= len(cactual_ps.prefix.arg_types): param_spec_target = cactual_ps.copy_modified( prefix=Parameters( arg_types=cactual_ps.prefix.arg_types[prefix_len:], @@ -1611,3 +1600,24 @@ def infer_callable_arguments_constraints( infer_directed_arg_constraints(left_by_name.typ, right_by_name.typ, direction) ) return res + + +def filter_imprecise_kinds(cs: list[Constraint]) -> list[Constraint]: + """For each ParamSpec remove all imprecise constraints, if at least one precise available.""" + have_precise = set() + for c in cs: + if not isinstance(c.origin_type_var, ParamSpecType): + continue + if ( + isinstance(c.target, ParamSpecType) + or isinstance(c.target, Parameters) + and not c.target.imprecise_arg_kinds + ): + have_precise.add(c.type_var) + new_cs = [] + for c in cs: + if not 
isinstance(c.origin_type_var, ParamSpecType) or c.type_var not in have_precise: + new_cs.append(c) + if not isinstance(c.target, Parameters) or not c.target.imprecise_arg_kinds: + new_cs.append(c) + return new_cs diff --git a/mypy/dmypy/client.py b/mypy/dmypy/client.py index 229740e44db0..9f0751e93609 100644 --- a/mypy/dmypy/client.py +++ b/mypy/dmypy/client.py @@ -573,7 +573,7 @@ def check_output( write_junit_xml( response["roundtrip_time"], bool(err), - messages, + {None: messages} if messages else {}, junit_xml, response["python_version"], response["platform"], diff --git a/mypy/dmypy_server.py b/mypy/dmypy_server.py index 0db349b5bf82..42236497f275 100644 --- a/mypy/dmypy_server.py +++ b/mypy/dmypy_server.py @@ -393,15 +393,21 @@ def cmd_recheck( t1 = time.time() manager = self.fine_grained_manager.manager manager.log(f"fine-grained increment: cmd_recheck: {t1 - t0:.3f}s") - self.options.export_types = export_types + old_export_types = self.options.export_types + self.options.export_types = self.options.export_types or export_types if not self.following_imports(): - messages = self.fine_grained_increment(sources, remove, update) + messages = self.fine_grained_increment( + sources, remove, update, explicit_export_types=export_types + ) else: assert remove is None and update is None - messages = self.fine_grained_increment_follow_imports(sources) + messages = self.fine_grained_increment_follow_imports( + sources, explicit_export_types=export_types + ) res = self.increment_output(messages, sources, is_tty, terminal_width) self.flush_caches() self.update_stats(res) + self.options.export_types = old_export_types return res def check( @@ -412,17 +418,21 @@ def check( If is_tty is True format the output nicely with colors and summary line (unless disabled in self.options). Also pass the terminal_width to formatter. """ - self.options.export_types = export_types + old_export_types = self.options.export_types + self.options.export_types = self.options.export_types or export_types if not self.fine_grained_manager: res = self.initialize_fine_grained(sources, is_tty, terminal_width) else: if not self.following_imports(): - messages = self.fine_grained_increment(sources) + messages = self.fine_grained_increment(sources, explicit_export_types=export_types) else: - messages = self.fine_grained_increment_follow_imports(sources) + messages = self.fine_grained_increment_follow_imports( + sources, explicit_export_types=export_types + ) res = self.increment_output(messages, sources, is_tty, terminal_width) self.flush_caches() self.update_stats(res) + self.options.export_types = old_export_types return res def flush_caches(self) -> None: @@ -535,6 +545,7 @@ def fine_grained_increment( sources: list[BuildSource], remove: list[str] | None = None, update: list[str] | None = None, + explicit_export_types: bool = False, ) -> list[str]: """Perform a fine-grained type checking increment. @@ -545,6 +556,8 @@ def fine_grained_increment( sources: sources passed on the command line remove: paths of files that have been removed update: paths of files that have been changed or created + explicit_export_types: --export-type was passed in a check command + (as opposite to being set in dmypy start) """ assert self.fine_grained_manager is not None manager = self.fine_grained_manager.manager @@ -559,6 +572,10 @@ def fine_grained_increment( # Use the remove/update lists to update fswatcher. # This avoids calling stat() for unchanged files. 
changed, removed = self.update_changed(sources, remove or [], update or []) + if explicit_export_types: + # If --export-types is given, we need to force full re-checking of all + # explicitly passed files, since we need to visit each expression. + add_all_sources_to_changed(sources, changed) changed += self.find_added_suppressed( self.fine_grained_manager.graph, set(), manager.search_paths ) @@ -577,7 +594,9 @@ def fine_grained_increment( self.previous_sources = sources return messages - def fine_grained_increment_follow_imports(self, sources: list[BuildSource]) -> list[str]: + def fine_grained_increment_follow_imports( + self, sources: list[BuildSource], explicit_export_types: bool = False + ) -> list[str]: """Like fine_grained_increment, but follow imports.""" t0 = time.time() @@ -603,6 +622,9 @@ def fine_grained_increment_follow_imports(self, sources: list[BuildSource]) -> l changed, new_files = self.find_reachable_changed_modules( sources, graph, seen, changed_paths ) + if explicit_export_types: + # Same as in fine_grained_increment(). + add_all_sources_to_changed(sources, changed) sources.extend(new_files) # Process changes directly reachable from roots. @@ -1011,6 +1033,22 @@ def find_all_sources_in_build( return result +def add_all_sources_to_changed(sources: list[BuildSource], changed: list[tuple[str, str]]) -> None: + """Add all (explicit) sources to the list changed files in place. + + Use this when re-processing of unchanged files is needed (e.g. for + the purpose of exporting types for inspections). + """ + changed_set = set(changed) + changed.extend( + [ + (bs.module, bs.path) + for bs in sources + if bs.path and (bs.module, bs.path) not in changed_set + ] + ) + + def fix_module_deps(graph: mypy.build.Graph) -> None: """After an incremental update, update module dependencies to reflect the new state. diff --git a/mypy/errorcodes.py b/mypy/errorcodes.py index 98600679da53..72ee63a6a897 100644 --- a/mypy/errorcodes.py +++ b/mypy/errorcodes.py @@ -255,6 +255,12 @@ def __hash__(self) -> int: "General", default_enabled=False, ) +MUTABLE_OVERRIDE: Final[ErrorCode] = ErrorCode( + "mutable-override", + "Reject covariant overrides for mutable attributes", + "General", + default_enabled=False, +) # Syntax errors are often blocking. @@ -274,3 +280,6 @@ def __hash__(self) -> int: "General", sub_code_of=MISC, ) + +# This copy will not include any error codes defined later in the plugins. 
+mypy_error_codes = error_codes.copy() diff --git a/mypy/errors.py b/mypy/errors.py index 4e62a48aeb27..6e90c28d9c03 100644 --- a/mypy/errors.py +++ b/mypy/errors.py @@ -8,7 +8,7 @@ from typing_extensions import Literal, TypeAlias as _TypeAlias from mypy import errorcodes as codes -from mypy.errorcodes import IMPORT, IMPORT_NOT_FOUND, IMPORT_UNTYPED, ErrorCode +from mypy.errorcodes import IMPORT, IMPORT_NOT_FOUND, IMPORT_UNTYPED, ErrorCode, mypy_error_codes from mypy.message_registry import ErrorMessage from mypy.options import Options from mypy.scope import Scope @@ -560,6 +560,7 @@ def add_error_info(self, info: ErrorInfo) -> None: and not self.options.hide_error_codes and info.code is not None and info.code not in HIDE_LINK_CODES + and info.code.code in mypy_error_codes ): message = f"See {BASE_RTD_URL}-{info.code.code} for more info" if message in self.only_once_messages: diff --git a/mypy/expandtype.py b/mypy/expandtype.py index cb09a1ee99f5..f6aa74add9d8 100644 --- a/mypy/expandtype.py +++ b/mypy/expandtype.py @@ -253,6 +253,7 @@ def visit_param_spec(self, t: ParamSpecType) -> Type: t.prefix.arg_kinds + repl.arg_kinds, t.prefix.arg_names + repl.arg_names, variables=[*t.prefix.variables, *repl.variables], + imprecise_arg_kinds=repl.imprecise_arg_kinds, ) else: # We could encode Any as trivial parameters etc., but it would be too verbose. @@ -306,18 +307,24 @@ def interpolate_args_for_unpack(self, t: CallableType, var_arg: UnpackType) -> l suffix = self.expand_types(t.arg_types[star_index + 1 :]) var_arg_type = get_proper_type(var_arg.type) - # We have something like Unpack[Tuple[Unpack[Ts], X1, X2]] - if isinstance(var_arg_type, TupleType): - expanded_tuple = var_arg_type.accept(self) - assert isinstance(expanded_tuple, ProperType) and isinstance(expanded_tuple, TupleType) - expanded_items = expanded_tuple.items - fallback = var_arg_type.partial_fallback + if isinstance(var_arg_type, Instance): + # we have something like Unpack[Tuple[Any, ...]] + new_unpack = var_arg else: - # We have plain Unpack[Ts] - assert isinstance(var_arg_type, TypeVarTupleType) - fallback = var_arg_type.tuple_fallback - expanded_items = self.expand_unpack(var_arg) - new_unpack = UnpackType(TupleType(expanded_items, fallback)) + if isinstance(var_arg_type, TupleType): + # We have something like Unpack[Tuple[Unpack[Ts], X1, X2]] + expanded_tuple = var_arg_type.accept(self) + assert isinstance(expanded_tuple, ProperType) and isinstance( + expanded_tuple, TupleType + ) + expanded_items = expanded_tuple.items + fallback = var_arg_type.partial_fallback + else: + # We have plain Unpack[Ts] + assert isinstance(var_arg_type, TypeVarTupleType), type(var_arg_type) + fallback = var_arg_type.tuple_fallback + expanded_items = self.expand_unpack(var_arg) + new_unpack = UnpackType(TupleType(expanded_items, fallback)) return prefix + [new_unpack] + suffix def visit_callable_type(self, t: CallableType) -> CallableType: diff --git a/mypy/fastparse.py b/mypy/fastparse.py index 95d99db84a15..cba01eab2e4e 100644 --- a/mypy/fastparse.py +++ b/mypy/fastparse.py @@ -190,7 +190,7 @@ def parse( source: str | bytes, fnam: str, module: str | None, - errors: Errors | None = None, + errors: Errors, options: Options | None = None, ) -> MypyFile: """Parse a source file, without doing any semantic analysis. @@ -199,16 +199,13 @@ def parse( on failure. Otherwise, use the errors object to report parse errors. 
""" ignore_errors = (options is not None and options.ignore_errors) or ( - errors is not None and fnam in errors.ignored_files + fnam in errors.ignored_files ) # If errors are ignored, we can drop many function bodies to speed up type checking. strip_function_bodies = ignore_errors and (options is None or not options.preserve_asts) - raise_on_error = False + if options is None: options = Options() - if errors is None: - errors = Errors(options) - raise_on_error = True errors.set_file(fnam, module, options=options) is_stub_file = fnam.endswith(".pyi") if is_stub_file: @@ -228,11 +225,9 @@ def parse( options=options, is_stub=is_stub_file, errors=errors, - ignore_errors=ignore_errors, strip_function_bodies=strip_function_bodies, + path=fnam, ).visit(ast) - tree.path = fnam - tree.is_stub = is_stub_file except SyntaxError as e: # alias to please mypyc is_py38_or_earlier = sys.version_info < (3, 9) @@ -254,9 +249,6 @@ def parse( ) tree = MypyFile([], [], False, {}) - if raise_on_error and errors.is_errors(): - errors.raise_error() - assert isinstance(tree, MypyFile) return tree @@ -357,8 +349,8 @@ def __init__( is_stub: bool, errors: Errors, *, - ignore_errors: bool, strip_function_bodies: bool, + path: str, ) -> None: # 'C' for class, 'D' for function signature, 'F' for function, 'L' for lambda self.class_and_function_stack: list[Literal["C", "D", "F", "L"]] = [] @@ -367,8 +359,8 @@ def __init__( self.options = options self.is_stub = is_stub self.errors = errors - self.ignore_errors = ignore_errors self.strip_function_bodies = strip_function_bodies + self.path = path self.type_ignores: dict[int, list[str]] = {} @@ -380,6 +372,10 @@ def note(self, msg: str, line: int, column: int) -> None: def fail(self, msg: ErrorMessage, line: int, column: int, blocker: bool = True) -> None: if blocker or not self.options.ignore_errors: + # Make sure self.errors reflects any type ignores that we have parsed + self.errors.set_file_ignored_lines( + self.path, self.type_ignores, self.options.ignore_errors + ) self.errors.report(line, column, msg.value, blocker=blocker, code=msg.code) def fail_merge_overload(self, node: IfStmt) -> None: @@ -858,8 +854,13 @@ def visit_Module(self, mod: ast3.Module) -> MypyFile: self.type_ignores[ti.lineno] = parsed else: self.fail(message_registry.INVALID_TYPE_IGNORE, ti.lineno, -1, blocker=False) + body = self.fix_function_overloads(self.translate_stmt_list(mod.body, ismodule=True)) - return MypyFile(body, self.imports, False, self.type_ignores) + + ret = MypyFile(body, self.imports, False, ignored_lines=self.type_ignores) + ret.is_stub = self.is_stub + ret.path = self.path + return ret # --- stmt --- # FunctionDef(identifier name, arguments args, diff --git a/mypy/main.py b/mypy/main.py index 1aede530c33e..8a35c2056963 100644 --- a/mypy/main.py +++ b/mypy/main.py @@ -7,6 +7,7 @@ import subprocess import sys import time +from collections import defaultdict from gettext import gettext from typing import IO, Any, Final, NoReturn, Sequence, TextIO @@ -158,11 +159,14 @@ def run_build( formatter = util.FancyFormatter(stdout, stderr, options.hide_error_codes) messages = [] + messages_by_file = defaultdict(list) - def flush_errors(new_messages: list[str], serious: bool) -> None: + def flush_errors(filename: str | None, new_messages: list[str], serious: bool) -> None: if options.pretty: new_messages = formatter.fit_in_terminal(new_messages) messages.extend(new_messages) + if new_messages: + messages_by_file[filename].extend(new_messages) if options.non_interactive: # Collect messages and 
possibly show them later. return @@ -200,7 +204,7 @@ def flush_errors(new_messages: list[str], serious: bool) -> None: ), file=stderr, ) - maybe_write_junit_xml(time.time() - t0, serious, messages, options) + maybe_write_junit_xml(time.time() - t0, serious, messages, messages_by_file, options) return res, messages, blockers @@ -1054,6 +1058,12 @@ def add_invertible_flag( other_group = parser.add_argument_group(title="Miscellaneous") other_group.add_argument("--quickstart-file", help=argparse.SUPPRESS) other_group.add_argument("--junit-xml", help="Write junit.xml to the given file") + imports_group.add_argument( + "--junit-format", + choices=["global", "per_file"], + default="global", + help="If --junit-xml is set, specifies format. global: single test with all errors; per_file: one test entry per file with failures", + ) other_group.add_argument( "--find-occurrences", metavar="CLASS.MEMBER", @@ -1483,18 +1493,37 @@ def process_cache_map( options.cache_map[source] = (meta_file, data_file) -def maybe_write_junit_xml(td: float, serious: bool, messages: list[str], options: Options) -> None: +def maybe_write_junit_xml( + td: float, + serious: bool, + all_messages: list[str], + messages_by_file: dict[str | None, list[str]], + options: Options, +) -> None: if options.junit_xml: py_version = f"{options.python_version[0]}_{options.python_version[1]}" - util.write_junit_xml( - td, serious, messages, options.junit_xml, py_version, options.platform - ) + if options.junit_format == "global": + util.write_junit_xml( + td, + serious, + {None: all_messages} if all_messages else {}, + options.junit_xml, + py_version, + options.platform, + ) + else: + # per_file + util.write_junit_xml( + td, serious, messages_by_file, options.junit_xml, py_version, options.platform + ) def fail(msg: str, stderr: TextIO, options: Options) -> NoReturn: """Fail with a serious error.""" stderr.write(f"{msg}\n") - maybe_write_junit_xml(0.0, serious=True, messages=[msg], options=options) + maybe_write_junit_xml( + 0.0, serious=True, all_messages=[msg], messages_by_file={None: [msg]}, options=options + ) sys.exit(2) diff --git a/mypy/meet.py b/mypy/meet.py index d2fb16808425..df8b960cdf3f 100644 --- a/mypy/meet.py +++ b/mypy/meet.py @@ -16,6 +16,7 @@ from mypy.typeops import is_recursive_pair, make_simplified_union, tuple_fallback from mypy.types import ( MYPYC_NATIVE_INT_NAMES, + TUPLE_LIKE_INSTANCE_NAMES, AnyType, CallableType, DeletedType, @@ -261,6 +262,7 @@ def is_overlapping_types( ignore_promotions: bool = False, prohibit_none_typevar_overlap: bool = False, ignore_uninhabited: bool = False, + seen_types: set[tuple[Type, Type]] | None = None, ) -> bool: """Can a value of type 'left' also be of type 'right' or vice-versa? @@ -274,18 +276,27 @@ def is_overlapping_types( # A type guard forces the new type even if it doesn't overlap the old. return True + if seen_types is None: + seen_types = set() + if (left, right) in seen_types: + return True + if isinstance(left, TypeAliasType) and isinstance(right, TypeAliasType): + seen_types.add((left, right)) + left, right = get_proper_types((left, right)) def _is_overlapping_types(left: Type, right: Type) -> bool: """Encode the kind of overlapping check to perform. - This function mostly exists so we don't have to repeat keyword arguments everywhere.""" + This function mostly exists, so we don't have to repeat keyword arguments everywhere. 
+ """ return is_overlapping_types( left, right, ignore_promotions=ignore_promotions, prohibit_none_typevar_overlap=prohibit_none_typevar_overlap, ignore_uninhabited=ignore_uninhabited, + seen_types=seen_types.copy(), ) # We should never encounter this type. @@ -936,7 +947,7 @@ def visit_tuple_type(self, t: TupleType) -> ProperType: return TupleType(items, tuple_fallback(t)) elif isinstance(self.s, Instance): # meet(Tuple[t1, t2, <...>], Tuple[s, ...]) == Tuple[meet(t1, s), meet(t2, s), <...>]. - if self.s.type.fullname == "builtins.tuple" and self.s.args: + if self.s.type.fullname in TUPLE_LIKE_INSTANCE_NAMES and self.s.args: return t.copy_modified(items=[meet_types(it, self.s.args[0]) for it in t.items]) elif is_proper_subtype(t, self.s): # A named tuple that inherits from a normal class diff --git a/mypy/message_registry.py b/mypy/message_registry.py index dc46eb503390..8dc14e158d90 100644 --- a/mypy/message_registry.py +++ b/mypy/message_registry.py @@ -63,6 +63,9 @@ def with_additional_msg(self, info: str) -> ErrorMessage: INCOMPATIBLE_TYPES_IN_ASSIGNMENT: Final = ErrorMessage( "Incompatible types in assignment", code=codes.ASSIGNMENT ) +COVARIANT_OVERRIDE_OF_MUTABLE_ATTRIBUTE: Final = ErrorMessage( + "Covariant override of a mutable attribute", code=codes.MUTABLE_OVERRIDE +) INCOMPATIBLE_TYPES_IN_AWAIT: Final = ErrorMessage('Incompatible types in "await"') INCOMPATIBLE_REDEFINITION: Final = ErrorMessage("Incompatible redefinition") INCOMPATIBLE_TYPES_IN_ASYNC_WITH_AENTER: Final = ( @@ -206,10 +209,10 @@ def with_additional_msg(self, info: str) -> ErrorMessage: ) TARGET_CLASS_HAS_NO_BASE_CLASS: Final = ErrorMessage("Target class has no base class") SUPER_OUTSIDE_OF_METHOD_NOT_SUPPORTED: Final = ErrorMessage( - "super() outside of a method is not supported" + '"super()" outside of a method is not supported' ) SUPER_ENCLOSING_POSITIONAL_ARGS_REQUIRED: Final = ErrorMessage( - "super() requires one or more positional arguments in enclosing function" + '"super()" requires one or two positional arguments in enclosing function' ) # Self-type diff --git a/mypy/messages.py b/mypy/messages.py index 19aafedd5586..069c4d51e281 100644 --- a/mypy/messages.py +++ b/mypy/messages.py @@ -991,10 +991,17 @@ def maybe_note_about_special_args(self, callee: CallableType, context: Context) context, ) + def unexpected_keyword_argument_for_function( + self, for_func: str, name: str, context: Context, *, matches: list[str] | None = None + ) -> None: + msg = f'Unexpected keyword argument "{name}"' + for_func + if matches: + msg += f"; did you mean {pretty_seq(matches, 'or')}?" + self.fail(msg, context, code=codes.CALL_ARG) + def unexpected_keyword_argument( self, callee: CallableType, name: str, arg_type: Type, context: Context ) -> None: - msg = f'Unexpected keyword argument "{name}"' + for_function(callee) # Suggest intended keyword, look for type match else fallback on any match. matching_type_args = [] not_matching_type_args = [] @@ -1008,9 +1015,9 @@ def unexpected_keyword_argument( matches = best_matches(name, matching_type_args, n=3) if not matches: matches = best_matches(name, not_matching_type_args, n=3) - if matches: - msg += f"; did you mean {pretty_seq(matches, 'or')}?" 
- self.fail(msg, context, code=codes.CALL_ARG) + self.unexpected_keyword_argument_for_function( + for_function(callee), name, context, matches=matches + ) module = find_defining_module(self.modules, callee) if module: assert callee.definition is not None @@ -2044,7 +2051,7 @@ def redundant_expr(self, description: str, truthiness: bool, context: Context) - def impossible_intersection( self, formatted_base_class_list: str, reason: str, context: Context ) -> None: - template = "Subclass of {} cannot exist: would have {}" + template = "Subclass of {} cannot exist: {}" self.fail( template.format(formatted_base_class_list, reason), context, code=codes.UNREACHABLE ) diff --git a/mypy/moduleinspect.py b/mypy/moduleinspect.py index 580b31fb4107..35db2132f66c 100644 --- a/mypy/moduleinspect.py +++ b/mypy/moduleinspect.py @@ -8,7 +8,7 @@ import pkgutil import queue import sys -from multiprocessing import Process, Queue +from multiprocessing import Queue, get_context from types import ModuleType @@ -123,9 +123,13 @@ def __init__(self) -> None: self._start() def _start(self) -> None: - self.tasks: Queue[str] = Queue() - self.results: Queue[ModuleProperties | str] = Queue() - self.proc = Process(target=worker, args=(self.tasks, self.results, sys.path)) + if sys.platform == "linux": + ctx = get_context("forkserver") + else: + ctx = get_context("spawn") + self.tasks: Queue[str] = ctx.Queue() + self.results: Queue[ModuleProperties | str] = ctx.Queue() + self.proc = ctx.Process(target=worker, args=(self.tasks, self.results, sys.path)) self.proc.start() self.counter = 0 # Number of successful roundtrips diff --git a/mypy/nodes.py b/mypy/nodes.py index d65a23a6b7fe..17e06613d1e3 100644 --- a/mypy/nodes.py +++ b/mypy/nodes.py @@ -513,6 +513,7 @@ class FuncBase(Node): "is_static", # Uses "@staticmethod" (explicit or implicit) "is_final", # Uses "@final" "is_explicit_override", # Uses "@override" + "is_type_check_only", # Uses "@type_check_only" "_fullname", ) @@ -530,6 +531,7 @@ def __init__(self) -> None: self.is_static = False self.is_final = False self.is_explicit_override = False + self.is_type_check_only = False # Name with module prefix self._fullname = "" @@ -2866,6 +2868,7 @@ class is generic then it will be a type constructor of higher kind. "type_var_tuple_suffix", "self_type", "dataclass_transform_spec", + "is_type_check_only", ) _fullname: str # Fully qualified name @@ -3016,6 +3019,9 @@ class is generic then it will be a type constructor of higher kind. 
# Added if the corresponding class is directly decorated with `typing.dataclass_transform` dataclass_transform_spec: DataclassTransformSpec | None + # Is set to `True` when class is decorated with `@typing.type_check_only` + is_type_check_only: bool + FLAGS: Final = [ "is_abstract", "is_enum", @@ -3072,6 +3078,7 @@ def __init__(self, names: SymbolTable, defn: ClassDef, module_name: str) -> None self.metadata = {} self.self_type = None self.dataclass_transform_spec = None + self.is_type_check_only = False def add_type_vars(self) -> None: self.has_type_var_tuple_type = False diff --git a/mypy/options.py b/mypy/options.py index 8bb20dbd4410..38a87e423766 100644 --- a/mypy/options.py +++ b/mypy/options.py @@ -255,6 +255,8 @@ def __init__(self) -> None: # Write junit.xml to given file self.junit_xml: str | None = None + self.junit_format: str = "global" # global|per_file + # Caching and incremental checking options self.incremental = True self.cache_dir = defaults.CACHE_DIR diff --git a/mypy/parse.py b/mypy/parse.py index 8bf9983967ba..ee61760c0ac0 100644 --- a/mypy/parse.py +++ b/mypy/parse.py @@ -6,7 +6,12 @@ def parse( - source: str | bytes, fnam: str, module: str | None, errors: Errors | None, options: Options + source: str | bytes, + fnam: str, + module: str | None, + errors: Errors, + options: Options, + raise_on_error: bool = False, ) -> MypyFile: """Parse a source file, without doing any semantic analysis. @@ -19,4 +24,7 @@ def parse( source = options.transform_source(source) import mypy.fastparse - return mypy.fastparse.parse(source, fnam=fnam, module=module, errors=errors, options=options) + tree = mypy.fastparse.parse(source, fnam=fnam, module=module, errors=errors, options=options) + if raise_on_error and errors.is_errors(): + errors.raise_error() + return tree diff --git a/mypy/semanal.py b/mypy/semanal.py index 6f322af816ea..4128369ace5d 100644 --- a/mypy/semanal.py +++ b/mypy/semanal.py @@ -251,6 +251,7 @@ REVEAL_TYPE_NAMES, TPDICT_NAMES, TYPE_ALIAS_NAMES, + TYPE_CHECK_ONLY_NAMES, TYPED_NAMEDTUPLE_NAMES, AnyType, CallableType, @@ -774,7 +775,7 @@ def file_context( self.globals = file_node.names self.tvar_scope = TypeVarLikeScope() - self.named_tuple_analyzer = NamedTupleAnalyzer(options, self) + self.named_tuple_analyzer = NamedTupleAnalyzer(options, self, self.msg) self.typed_dict_analyzer = TypedDictAnalyzer(options, self, self.msg) self.enum_call_analyzer = EnumCallAnalyzer(options, self) self.newtype_analyzer = NewTypeAnalyzer(options, self, self.msg) @@ -1568,6 +1569,9 @@ def visit_decorator(self, dec: Decorator) -> None: removed.append(i) else: self.fail("@final cannot be used with non-method functions", d) + elif refers_to_fullname(d, TYPE_CHECK_ONLY_NAMES): + # TODO: support `@overload` funcs. 
+ dec.func.is_type_check_only = True elif isinstance(d, CallExpr) and refers_to_fullname( d.callee, DATACLASS_TRANSFORM_NAMES ): @@ -1739,9 +1743,8 @@ def analyze_typeddict_classdef(self, defn: ClassDef) -> bool: if is_typeddict: for decorator in defn.decorators: decorator.accept(self) - if isinstance(decorator, RefExpr): - if decorator.fullname in FINAL_DECORATOR_NAMES and info is not None: - info.is_final = True + if info is not None: + self.analyze_class_decorator_common(defn, info, decorator) if info is None: self.mark_incomplete(defn.name, defn) else: @@ -1777,8 +1780,7 @@ def analyze_namedtuple_classdef( with self.scope.class_scope(defn.info): for deco in defn.decorators: deco.accept(self) - if isinstance(deco, RefExpr) and deco.fullname in FINAL_DECORATOR_NAMES: - info.is_final = True + self.analyze_class_decorator_common(defn, defn.info, deco) with self.named_tuple_analyzer.save_namedtuple_body(info): self.analyze_class_body_common(defn) return True @@ -1860,19 +1862,30 @@ def leave_class(self) -> None: def analyze_class_decorator(self, defn: ClassDef, decorator: Expression) -> None: decorator.accept(self) + self.analyze_class_decorator_common(defn, defn.info, decorator) if isinstance(decorator, RefExpr): if decorator.fullname in RUNTIME_PROTOCOL_DECOS: if defn.info.is_protocol: defn.info.runtime_protocol = True else: self.fail("@runtime_checkable can only be used with protocol classes", defn) - elif decorator.fullname in FINAL_DECORATOR_NAMES: - defn.info.is_final = True elif isinstance(decorator, CallExpr) and refers_to_fullname( decorator.callee, DATACLASS_TRANSFORM_NAMES ): defn.info.dataclass_transform_spec = self.parse_dataclass_transform_spec(decorator) + def analyze_class_decorator_common( + self, defn: ClassDef, info: TypeInfo, decorator: Expression + ) -> None: + """Common method for applying class decorators. + + Called on regular classes, typeddicts, and namedtuples. 
+ """ + if refers_to_fullname(decorator, FINAL_DECORATOR_NAMES): + info.is_final = True + elif refers_to_fullname(decorator, TYPE_CHECK_ONLY_NAMES): + info.is_type_check_only = True + def clean_up_bases_and_infer_type_variables( self, defn: ClassDef, base_type_exprs: list[Expression], context: Context ) -> tuple[list[Expression], list[TypeVarLikeType], bool]: @@ -2842,22 +2855,23 @@ def visit_assignment_stmt(self, s: AssignmentStmt) -> None: if self.check_and_set_up_type_alias(s): s.is_alias_def = True special_form = True - # * type variable definition - elif self.process_typevar_declaration(s): - special_form = True - elif self.process_paramspec_declaration(s): - special_form = True - elif self.process_typevartuple_declaration(s): - special_form = True - # * type constructors - elif self.analyze_namedtuple_assign(s): - special_form = True - elif self.analyze_typeddict_assign(s): - special_form = True - elif self.newtype_analyzer.process_newtype_declaration(s): - special_form = True - elif self.analyze_enum_assign(s): - special_form = True + elif isinstance(s.rvalue, CallExpr): + # * type variable definition + if self.process_typevar_declaration(s): + special_form = True + elif self.process_paramspec_declaration(s): + special_form = True + elif self.process_typevartuple_declaration(s): + special_form = True + # * type constructors + elif self.analyze_namedtuple_assign(s): + special_form = True + elif self.analyze_typeddict_assign(s): + special_form = True + elif self.newtype_analyzer.process_newtype_declaration(s): + special_form = True + elif self.analyze_enum_assign(s): + special_form = True if special_form: self.record_special_form_lvalue(s) diff --git a/mypy/semanal_enum.py b/mypy/semanal_enum.py index cd11204c3bcc..528b0519cca1 100644 --- a/mypy/semanal_enum.py +++ b/mypy/semanal_enum.py @@ -106,16 +106,19 @@ class A(enum.Enum): items, values, ok = self.parse_enum_call_args(call, fullname.split(".")[-1]) if not ok: # Error. Construct dummy return value. - info = self.build_enum_call_typeinfo(var_name, [], fullname, node.line) + name = var_name + if is_func_scope: + name += "@" + str(call.line) + info = self.build_enum_call_typeinfo(name, [], fullname, node.line) else: name = cast(StrExpr, call.args[0]).value if name != var_name or is_func_scope: # Give it a unique name derived from the line number. name += "@" + str(call.line) info = self.build_enum_call_typeinfo(name, items, fullname, call.line) - # Store generated TypeInfo under both names, see semanal_namedtuple for more details. - if name != var_name or is_func_scope: - self.api.add_symbol_skip_local(name, info) + # Store generated TypeInfo under both names, see semanal_namedtuple for more details. 
+ if name != var_name or is_func_scope: + self.api.add_symbol_skip_local(name, info) call.analyzed = EnumCallExpr(info, items, values) call.analyzed.set_line(call) info.line = node.line diff --git a/mypy/semanal_namedtuple.py b/mypy/semanal_namedtuple.py index 80cf1c4e184a..bc3c5dd61894 100644 --- a/mypy/semanal_namedtuple.py +++ b/mypy/semanal_namedtuple.py @@ -9,6 +9,7 @@ from typing import Final, Iterator, List, Mapping, cast from mypy.exprtotype import TypeTranslationError, expr_to_unanalyzed_type +from mypy.messages import MessageBuilder from mypy.nodes import ( ARG_NAMED_OPT, ARG_OPT, @@ -91,9 +92,12 @@ class NamedTupleAnalyzer: - def __init__(self, options: Options, api: SemanticAnalyzerInterface) -> None: + def __init__( + self, options: Options, api: SemanticAnalyzerInterface, msg: MessageBuilder + ) -> None: self.options = options self.api = api + self.msg = msg def analyze_namedtuple_classdef( self, defn: ClassDef, is_stub_file: bool, is_func_scope: bool @@ -204,6 +208,10 @@ def check_namedtuple_classdef( ) else: default_items[name] = stmt.rvalue + if defn.keywords: + for_function = ' for "__init_subclass__" of "NamedTuple"' + for key in defn.keywords: + self.msg.unexpected_keyword_argument_for_function(for_function, key, defn) return items, types, default_items, statements def check_namedtuple( diff --git a/mypy/semanal_typeddict.py b/mypy/semanal_typeddict.py index e9aaee55879a..13aab4de65e4 100644 --- a/mypy/semanal_typeddict.py +++ b/mypy/semanal_typeddict.py @@ -37,6 +37,7 @@ has_placeholder, require_bool_literal_argument, ) +from mypy.state import state from mypy.typeanal import check_for_explicit_any, has_any_from_unimported_type from mypy.types import ( TPDICT_NAMES, @@ -203,7 +204,8 @@ def add_keys_and_types_from_base( any_kind = TypeOfAny.from_error base_args = [AnyType(any_kind) for _ in tvars] - valid_items = self.map_items_to_base(valid_items, tvars, base_args) + with state.strict_optional_set(self.options.strict_optional): + valid_items = self.map_items_to_base(valid_items, tvars, base_args) for key in base_items: if key in keys: self.fail(f'Overwriting TypedDict field "{key}" while merging', ctx) @@ -321,6 +323,12 @@ def analyze_typeddict_classdef_fields( total: bool | None = True if "total" in defn.keywords: total = require_bool_literal_argument(self.api, defn.keywords["total"], "total", True) + if defn.keywords and defn.keywords.keys() != {"total"}: + for_function = ' for "__init_subclass__" of "TypedDict"' + for key in defn.keywords: + if key == "total": + continue + self.msg.unexpected_keyword_argument_for_function(for_function, key, defn) required_keys = { field for (field, t) in zip(fields, types) @@ -392,6 +400,17 @@ def check_typeddict( types = [ # unwrap Required[T] to just T t.item if isinstance(t, RequiredType) else t for t in types ] + + # Perform various validations after unwrapping. + for t in types: + check_for_explicit_any( + t, self.options, self.api.is_typeshed_stub_file, self.msg, context=call + ) + if self.options.disallow_any_unimported: + for t in types: + if has_any_from_unimported_type(t): + self.msg.unimported_type_becomes_any("Type of a TypedDict key", t, call) + existing_info = None if isinstance(node.analyzed, TypedDictExpr): existing_info = node.analyzed.info @@ -449,15 +468,6 @@ def parse_typeddict_args( # One of the types is not ready, defer. 
return None items, types, ok = res - for t in types: - check_for_explicit_any( - t, self.options, self.api.is_typeshed_stub_file, self.msg, context=call - ) - - if self.options.disallow_any_unimported: - for t in types: - if has_any_from_unimported_type(t): - self.msg.unimported_type_becomes_any("Type of a TypedDict key", t, dictexpr) assert total is not None return args[0].value, items, types, total, tvar_defs, ok diff --git a/mypy/solve.py b/mypy/solve.py index efe8e487c506..9770364bf892 100644 --- a/mypy/solve.py +++ b/mypy/solve.py @@ -6,7 +6,7 @@ from typing import Iterable, Sequence from typing_extensions import TypeAlias as _TypeAlias -from mypy.constraints import SUBTYPE_OF, SUPERTYPE_OF, Constraint, infer_constraints +from mypy.constraints import SUBTYPE_OF, SUPERTYPE_OF, Constraint, infer_constraints, neg_op from mypy.expandtype import expand_type from mypy.graph_utils import prepare_sccs, strongly_connected_components, topsort from mypy.join import join_types @@ -69,6 +69,10 @@ def solve_constraints( extra_vars.extend([v.id for v in c.extra_tvars if v.id not in vars + extra_vars]) originals.update({v.id: v for v in c.extra_tvars if v.id not in originals}) + if allow_polymorphic: + # Constraints inferred from unions require special handling in polymorphic inference. + constraints = skip_reverse_union_constraints(constraints) + # Collect a list of constraints for each type variable. cmap: dict[TypeVarId, list[Constraint]] = {tv: [] for tv in vars + extra_vars} for con in constraints: @@ -431,10 +435,7 @@ def transitive_closure( uppers[l] |= uppers[upper] for lt in lowers[lower]: for ut in uppers[upper]: - # TODO: what if secondary constraints result in inference - # against polymorphic actual (also in below branches)? - remaining |= set(infer_constraints(lt, ut, SUBTYPE_OF)) - remaining |= set(infer_constraints(ut, lt, SUPERTYPE_OF)) + add_secondary_constraints(remaining, lt, ut) elif c.op == SUBTYPE_OF: if c.target in uppers[c.type_var]: continue @@ -442,8 +443,7 @@ def transitive_closure( if (l, c.type_var) in graph: uppers[l].add(c.target) for lt in lowers[c.type_var]: - remaining |= set(infer_constraints(lt, c.target, SUBTYPE_OF)) - remaining |= set(infer_constraints(c.target, lt, SUPERTYPE_OF)) + add_secondary_constraints(remaining, lt, c.target) else: assert c.op == SUPERTYPE_OF if c.target in lowers[c.type_var]: @@ -452,11 +452,24 @@ def transitive_closure( if (c.type_var, u) in graph: lowers[u].add(c.target) for ut in uppers[c.type_var]: - remaining |= set(infer_constraints(ut, c.target, SUPERTYPE_OF)) - remaining |= set(infer_constraints(c.target, ut, SUBTYPE_OF)) + add_secondary_constraints(remaining, c.target, ut) return graph, lowers, uppers +def add_secondary_constraints(cs: set[Constraint], lower: Type, upper: Type) -> None: + """Add secondary constraints inferred between lower and upper (in place).""" + if isinstance(get_proper_type(upper), UnionType) and isinstance( + get_proper_type(lower), UnionType + ): + # When both types are unions, this can lead to inferring spurious constraints, + # for example Union[T, int] <: S <: Union[T, int] may infer T <: int. + # To avoid this, just skip them for now. + return + # TODO: what if secondary constraints result in inference against polymorphic actual? 
+ cs.update(set(infer_constraints(lower, upper, SUBTYPE_OF))) + cs.update(set(infer_constraints(upper, lower, SUPERTYPE_OF))) + + def compute_dependencies( tvars: list[TypeVarId], graph: Graph, lowers: Bounds, uppers: Bounds ) -> dict[TypeVarId, list[TypeVarId]]: @@ -494,6 +507,28 @@ def check_linear(scc: set[TypeVarId], lowers: Bounds, uppers: Bounds) -> bool: return True +def skip_reverse_union_constraints(cs: list[Constraint]) -> list[Constraint]: + """Avoid ambiguities for constraints inferred from unions during polymorphic inference. + + Polymorphic inference implicitly relies on assumption that a reverse of a linear constraint + is a linear constraint. This is however not true in presence of union types, for example + T :> Union[S, int] vs S <: T. Trying to solve such constraints would be detected ambiguous + as (T, S) form a non-linear SCC. However, simply removing the linear part results in a valid + solution T = Union[S, int], S = . + + TODO: a cleaner solution may be to avoid inferring such constraints in first place, but + this would require passing around a flag through all infer_constraints() calls. + """ + reverse_union_cs = set() + for c in cs: + p_target = get_proper_type(c.target) + if isinstance(p_target, UnionType): + for item in p_target.items: + if isinstance(item, TypeVarType): + reverse_union_cs.add(Constraint(item, neg_op(c.op), c.origin_type_var)) + return [c for c in cs if c not in reverse_union_cs] + + def get_vars(target: Type, vars: list[TypeVarId]) -> set[TypeVarId]: """Find type variables for which we are solving in a target type.""" return {tv.id for tv in get_all_type_vars(target)} & set(vars) diff --git a/mypy/stubdoc.py b/mypy/stubdoc.py index c277573f0b59..86ff6e2bb540 100644 --- a/mypy/stubdoc.py +++ b/mypy/stubdoc.py @@ -36,11 +36,19 @@ def is_valid_type(s: str) -> bool: class ArgSig: """Signature info for a single argument.""" - def __init__(self, name: str, type: str | None = None, default: bool = False): + def __init__( + self, + name: str, + type: str | None = None, + *, + default: bool = False, + default_value: str = "...", + ) -> None: self.name = name self.type = type # Does this argument have a default value? self.default = default + self.default_value = default_value def is_star_arg(self) -> bool: return self.name.startswith("*") and not self.name.startswith("**") @@ -59,6 +67,7 @@ def __eq__(self, other: Any) -> bool: self.name == other.name and self.type == other.type and self.default == other.default + and self.default_value == other.default_value ) return False @@ -119,10 +128,10 @@ def format_sig( if arg_type: arg_def += ": " + arg_type if arg.default: - arg_def += " = ..." + arg_def += f" = {arg.default_value}" elif arg.default: - arg_def += "=..." 
+ arg_def += f"={arg.default_value}" args.append(arg_def) @@ -374,7 +383,8 @@ def infer_ret_type_sig_from_docstring(docstr: str, name: str) -> str | None: def infer_ret_type_sig_from_anon_docstring(docstr: str) -> str | None: """Convert signature in form of "(self: TestClass, arg0) -> int" to their return type.""" - return infer_ret_type_sig_from_docstring("stub" + docstr.strip(), "stub") + lines = ["stub" + line.strip() for line in docstr.splitlines() if line.strip().startswith("(")] + return infer_ret_type_sig_from_docstring("".join(lines), "stub") def parse_signature(sig: str) -> tuple[str, list[str], list[str]] | None: diff --git a/mypy/stubgen.py b/mypy/stubgen.py index 837cd723c410..23b5fde9dff2 100755 --- a/mypy/stubgen.py +++ b/mypy/stubgen.py @@ -99,6 +99,7 @@ NameExpr, OpExpr, OverloadedFuncDef, + SetExpr, Statement, StrExpr, TempNode, @@ -491,15 +492,21 @@ def _get_func_args(self, o: FuncDef, ctx: FunctionContext) -> list[ArgSig]: if kind.is_named() and not any(arg.name.startswith("*") for arg in args): args.append(ArgSig("*")) + default = "..." if arg_.initializer: if not typename: typename = self.get_str_type_of_node(arg_.initializer, True, False) + potential_default, valid = self.get_str_default_of_node(arg_.initializer) + if valid and len(potential_default) <= 200: + default = potential_default elif kind == ARG_STAR: name = f"*{name}" elif kind == ARG_STAR2: name = f"**{name}" - args.append(ArgSig(name, typename, default=bool(arg_.initializer))) + args.append( + ArgSig(name, typename, default=bool(arg_.initializer), default_value=default) + ) if ctx.class_info is not None and all( arg.type is None and arg.default is False for arg in args @@ -1234,6 +1241,70 @@ def maybe_unwrap_unary_expr(self, expr: Expression) -> Expression: # This is some other unary expr, we cannot do anything with it (yet?). return expr + def get_str_default_of_node(self, rvalue: Expression) -> tuple[str, bool]: + """Get a string representation of the default value of a node. + + Returns a 2-tuple of the default and whether or not it is valid. 
+ """ + if isinstance(rvalue, NameExpr): + if rvalue.name in ("None", "True", "False"): + return rvalue.name, True + elif isinstance(rvalue, (IntExpr, FloatExpr)): + return f"{rvalue.value}", True + elif isinstance(rvalue, UnaryExpr): + if isinstance(rvalue.expr, (IntExpr, FloatExpr)): + return f"{rvalue.op}{rvalue.expr.value}", True + elif isinstance(rvalue, StrExpr): + return repr(rvalue.value), True + elif isinstance(rvalue, BytesExpr): + return "b" + repr(rvalue.value).replace("\\\\", "\\"), True + elif isinstance(rvalue, TupleExpr): + items_defaults = [] + for e in rvalue.items: + e_default, valid = self.get_str_default_of_node(e) + if not valid: + break + items_defaults.append(e_default) + else: + closing = ",)" if len(items_defaults) == 1 else ")" + default = "(" + ", ".join(items_defaults) + closing + return default, True + elif isinstance(rvalue, ListExpr): + items_defaults = [] + for e in rvalue.items: + e_default, valid = self.get_str_default_of_node(e) + if not valid: + break + items_defaults.append(e_default) + else: + default = "[" + ", ".join(items_defaults) + "]" + return default, True + elif isinstance(rvalue, SetExpr): + items_defaults = [] + for e in rvalue.items: + e_default, valid = self.get_str_default_of_node(e) + if not valid: + break + items_defaults.append(e_default) + else: + if items_defaults: + default = "{" + ", ".join(items_defaults) + "}" + return default, True + elif isinstance(rvalue, DictExpr): + items_defaults = [] + for k, v in rvalue.items: + if k is None: + break + k_default, k_valid = self.get_str_default_of_node(k) + v_default, v_valid = self.get_str_default_of_node(v) + if not (k_valid and v_valid): + break + items_defaults.append(f"{k_default}: {v_default}") + else: + default = "{" + ", ".join(items_defaults) + "}" + return default, True + return "...", False + def should_reexport(self, name: str, full_module: str, name_is_alias: bool) -> bool: is_private = self.is_private_name(name, full_module + "." + name) if ( @@ -1629,6 +1700,7 @@ def generate_stubs(options: Options) -> None: doc_dir=options.doc_dir, include_private=options.include_private, export_less=options.export_less, + include_docstrings=options.include_docstrings, ) num_modules = len(all_modules) if not options.quiet and num_modules > 0: diff --git a/mypy/stubgenc.py b/mypy/stubgenc.py index 0ad79a4265b3..39288197f477 100755 --- a/mypy/stubgenc.py +++ b/mypy/stubgenc.py @@ -126,10 +126,12 @@ def get_property_type(self, default_type: str | None, ctx: FunctionContext) -> s """Infer property type from docstring or docstring signature.""" if ctx.docstring is not None: inferred = infer_ret_type_sig_from_anon_docstring(ctx.docstring) - if not inferred: - inferred = infer_ret_type_sig_from_docstring(ctx.docstring, ctx.name) - if not inferred: - inferred = infer_prop_type_from_docstring(ctx.docstring) + if inferred: + return inferred + inferred = infer_ret_type_sig_from_docstring(ctx.docstring, ctx.name) + if inferred: + return inferred + inferred = infer_prop_type_from_docstring(ctx.docstring) return inferred else: return None @@ -237,6 +239,26 @@ def __init__( self.resort_members = self.is_c_module super().__init__(_all_, include_private, export_less, include_docstrings) self.module_name = module_name + if self.is_c_module: + # Add additional implicit imports. + # C-extensions are given more lattitude since they do not import the typing module. 
+            self.known_imports.update(
+                {
+                    "typing": [
+                        "Any",
+                        "Callable",
+                        "ClassVar",
+                        "Dict",
+                        "Iterable",
+                        "Iterator",
+                        "List",
+                        "NamedTuple",
+                        "Optional",
+                        "Tuple",
+                        "Union",
+                    ]
+                }
+            )
 
     def get_default_function_sig(self, func: object, ctx: FunctionContext) -> FunctionSig:
         argspec = None
@@ -590,9 +612,29 @@ def generate_function_stub(
         if inferred[0].args and inferred[0].args[0].name == "cls":
             decorators.append("@classmethod")
 
+        if docstring:
+            docstring = self._indent_docstring(docstring)
         output.extend(self.format_func_def(inferred, decorators=decorators, docstring=docstring))
         self._fix_iter(ctx, inferred, output)
 
+    def _indent_docstring(self, docstring: str) -> str:
+        """Fix indentation of docstring extracted from pybind11 or other binding generators."""
+        lines = docstring.splitlines(keepends=True)
+        indent = self._indent + "    "
+        if len(lines) > 1:
+            if not all(line.startswith(indent) or not line.strip() for line in lines):
+                # if the docstring is not indented, then indent all but the first line
+                for i, line in enumerate(lines[1:]):
+                    if line.strip():
+                        lines[i + 1] = indent + line
+        # if there's a trailing newline, add a final line to visually indent the quoted docstring
+        if lines[-1].endswith("\n"):
+            if len(lines) > 1:
+                lines.append(indent)
+            else:
+                lines[-1] = lines[-1][:-1]
+        return "".join(lines)
+
     def _fix_iter(
         self, ctx: FunctionContext, inferred: list[FunctionSig], output: list[str]
     ) -> None:
@@ -640,7 +682,7 @@ def generate_property_stub(
         if fget:
             alt_docstr = getattr(fget, "__doc__", None)
             if alt_docstr and docstring:
-                docstring += alt_docstr
+                docstring += "\n" + alt_docstr
             elif alt_docstr:
                 docstring = alt_docstr
 
diff --git a/mypy/stubtest.py b/mypy/stubtest.py
index e80ea4eac71f..c02a3efd8dc0 100644
--- a/mypy/stubtest.py
+++ b/mypy/stubtest.py
@@ -55,6 +55,17 @@ def __repr__(self) -> str:
 T = TypeVar("T")
 MaybeMissing: typing_extensions.TypeAlias = Union[T, Missing]
 
+
+class Unrepresentable:
+    """Marker object for unrepresentable parameter defaults."""
+
+    def __repr__(self) -> str:
+        return "<unrepresentable>"
+
+
+UNREPRESENTABLE: typing_extensions.Final = Unrepresentable()
+
+
 _formatter: typing_extensions.Final = FancyFormatter(sys.stdout, sys.stderr, False)
 
 
@@ -102,7 +113,17 @@ def __init__(
         self.stub_object = stub_object
         self.runtime_object = runtime_object
         self.stub_desc = stub_desc or str(getattr(stub_object, "type", stub_object))
-        self.runtime_desc = runtime_desc or _truncate(repr(runtime_object), 100)
+
+        if runtime_desc is None:
+            runtime_sig = safe_inspect_signature(runtime_object)
+            if runtime_sig is None:
+                self.runtime_desc = _truncate(repr(runtime_object), 100)
+            else:
+                runtime_is_async = inspect.iscoroutinefunction(runtime_object)
+                description = describe_runtime_callable(runtime_sig, is_async=runtime_is_async)
+                self.runtime_desc = _truncate(description, 100)
+        else:
+            self.runtime_desc = runtime_desc
 
     def is_missing_stub(self) -> bool:
         """Whether or not the error is for something missing from the stub."""
@@ -484,6 +505,19 @@ def _verify_metaclass(
 def verify_typeinfo(
     stub: nodes.TypeInfo, runtime: MaybeMissing[type[Any]], object_path: list[str]
 ) -> Iterator[Error]:
+    if stub.is_type_check_only:
+        # This type only exists in stubs, we only check that the runtime part
+        # is missing. Other checks are not required.
+        if not isinstance(runtime, Missing):
+            yield Error(
+                object_path,
+                'is marked as "@type_check_only", but also exists at runtime',
+                stub,
+                runtime,
+                stub_desc=repr(stub),
+            )
+        return
+
     if isinstance(runtime, Missing):
         yield Error(object_path, "is not present at runtime", stub, runtime, stub_desc=repr(stub))
         return
@@ -658,6 +692,7 @@ def _verify_arg_default_value(
         if (
             stub_default is not UNKNOWN
             and stub_default is not ...
+            and runtime_arg.default is not UNREPRESENTABLE
             and (
                 stub_default != runtime_arg.default
                 # We want the types to match exactly, e.g. in case the stub has
@@ -987,7 +1022,7 @@ def verify_funcitem(
     if signature:
         stub_sig = Signature.from_funcitem(stub)
         runtime_sig = Signature.from_inspect_signature(signature)
-        runtime_sig_desc = f'{"async " if runtime_is_coroutine else ""}def {signature}'
+        runtime_sig_desc = describe_runtime_callable(signature, is_async=runtime_is_coroutine)
         stub_desc = str(stub_sig)
     else:
         runtime_sig_desc, stub_desc = None, None
@@ -1066,6 +1101,7 @@ def verify_var(
 def verify_overloadedfuncdef(
     stub: nodes.OverloadedFuncDef, runtime: MaybeMissing[Any], object_path: list[str]
 ) -> Iterator[Error]:
+    # TODO: support `@type_check_only` decorator
     if isinstance(runtime, Missing):
         yield Error(object_path, "is not present at runtime", stub, runtime)
         return
@@ -1215,6 +1251,12 @@ def _resolve_funcitem_from_decorator(dec: nodes.OverloadPart) -> nodes.FuncItem
     def apply_decorator_to_funcitem(
         decorator: nodes.Expression, func: nodes.FuncItem
     ) -> nodes.FuncItem | None:
+        if (
+            isinstance(decorator, nodes.CallExpr)
+            and isinstance(decorator.callee, nodes.RefExpr)
+            and decorator.callee.fullname in mypy.types.DEPRECATED_TYPE_NAMES
+        ):
+            return func
         if not isinstance(decorator, nodes.RefExpr):
             return None
         if not decorator.fullname:
@@ -1223,6 +1265,7 @@ def apply_decorator_to_funcitem(
         if (
             decorator.fullname in ("builtins.staticmethod", "abc.abstractmethod")
             or decorator.fullname in mypy.types.OVERLOAD_NAMES
+            or decorator.fullname in mypy.types.FINAL_DECORATOR_NAMES
         ):
             return func
         if decorator.fullname == "builtins.classmethod":
@@ -1253,6 +1296,19 @@ def apply_decorator_to_funcitem(
 def verify_decorator(
     stub: nodes.Decorator, runtime: MaybeMissing[Any], object_path: list[str]
 ) -> Iterator[Error]:
+    if stub.func.is_type_check_only:
+        # This function only exists in stubs, we only check that the runtime part
+        # is missing. Other checks are not required.
+        if not isinstance(runtime, Missing):
+            yield Error(
+                object_path,
+                'is marked as "@type_check_only", but also exists at runtime',
+                stub,
+                runtime,
+                stub_desc=repr(stub),
+            )
+        return
+
     if isinstance(runtime, Missing):
         yield Error(object_path, "is not present at runtime", stub, runtime)
         return
@@ -1374,7 +1430,6 @@ def verify_typealias(
     "__annotations__",
     "__text_signature__",
     "__weakref__",
-    "__del__",  # Only ever called when an object is being deleted, who cares?
     "__hash__",
     "__getattr__",  # resulting behaviour might be typed explicitly
     "__setattr__",  # defining this on a class can cause worse type checking
@@ -1440,7 +1495,27 @@ def is_read_only_property(runtime: object) -> bool:
 def safe_inspect_signature(runtime: Any) -> inspect.Signature | None:
     try:
-        return inspect.signature(runtime)
+        try:
+            return inspect.signature(runtime)
+        except ValueError:
+            if (
+                hasattr(runtime, "__text_signature__")
+                and "<unrepresentable>" in runtime.__text_signature__
+            ):
+                # Try to fix up the signature. Workaround for
+                # https://github.com/python/cpython/issues/87233
+                sig = runtime.__text_signature__.replace("<unrepresentable>", "...")
+                sig = inspect._signature_fromstr(inspect.Signature, runtime, sig)  # type: ignore[attr-defined]
+                assert isinstance(sig, inspect.Signature)
+                new_params = [
+                    parameter.replace(default=UNREPRESENTABLE)
+                    if parameter.default is ...
+                    else parameter
+                    for parameter in sig.parameters.values()
+                ]
+                return sig.replace(parameters=new_params)
+            else:
+                raise
     except Exception:
         # inspect.signature throws ValueError all the time
         # catch RuntimeError because of https://bugs.python.org/issue39504
@@ -1449,6 +1524,10 @@ def safe_inspect_signature(runtime: Any) -> inspect.Signature | None:
         return None
 
 
+def describe_runtime_callable(signature: inspect.Signature, *, is_async: bool) -> str:
+    return f'{"async " if is_async else ""}def {signature}'
+
+
 def is_subtype_helper(left: mypy.types.Type, right: mypy.types.Type) -> bool:
     """Checks whether ``left`` is a subtype of ``right``."""
     left = mypy.types.get_proper_type(left)
diff --git a/mypy/stubutil.py b/mypy/stubutil.py
index cc3b63098fd2..b8d601ed3c6b 100644
--- a/mypy/stubutil.py
+++ b/mypy/stubutil.py
@@ -576,6 +576,14 @@ def __init__(
         self.sig_generators = self.get_sig_generators()
         # populated by visit_mypy_file
         self.module_name: str = ""
+        # These are "soft" imports for objects which might appear in annotations but not have
+        # a corresponding import statement.
+        self.known_imports = {
+            "_typeshed": ["Incomplete"],
+            "typing": ["Any", "TypeVar", "NamedTuple"],
+            "collections.abc": ["Generator"],
+            "typing_extensions": ["TypedDict", "ParamSpec", "TypeVarTuple"],
+        }
 
     def get_sig_generators(self) -> list[SignatureGenerator]:
         return []
@@ -614,10 +622,24 @@ def get_imports(self) -> str:
 
     def output(self) -> str:
         """Return the text for the stub."""
-        imports = self.get_imports()
-        if imports and self._output:
-            imports += "\n"
-        return imports + "".join(self._output)
+        pieces: list[str] = []
+        if imports := self.get_imports():
+            pieces.append(imports)
+        if dunder_all := self.get_dunder_all():
+            pieces.append(dunder_all)
+        if self._output:
+            pieces.append("".join(self._output))
+        return "\n".join(pieces)
+
+    def get_dunder_all(self) -> str:
+        """Return the __all__ list for the stub."""
+        if self._all_:
+            # Note we emit all names in the runtime __all__ here, even if they
+            # don't actually exist. If that happens, the runtime has a bug, and
+            # it's not obvious what the correct behavior should be. We choose
+            # to reflect the runtime __all__ as closely as possible.
+            return f"__all__ = {self._all_!r}\n"
+        return ""
 
     def add(self, string: str) -> None:
         """Add text to generated stub."""
@@ -651,18 +673,9 @@ def set_defined_names(self, defined_names: set[str]) -> None:
         self.defined_names = defined_names
         # Names in __all__ are required
        for name in self._all_ or ():
-            if name not in self.IGNORED_DUNDERS:
-                self.import_tracker.reexport(name)
+            self.import_tracker.reexport(name)
 
-        # These are "soft" imports for objects which might appear in annotations but not have
-        # a corresponding import statement.
-        known_imports = {
-            "_typeshed": ["Incomplete"],
-            "typing": ["Any", "TypeVar", "NamedTuple"],
-            "collections.abc": ["Generator"],
-            "typing_extensions": ["TypedDict", "ParamSpec", "TypeVarTuple"],
-        }
-        for pkg, imports in known_imports.items():
+        for pkg, imports in self.known_imports.items():
             for t in imports:
                 # require=False means that the import won't be added unless require_name() is called
                 # for the object during generation.
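
The effect of the `output()` and `get_dunder_all()` changes in the hunks above is easiest to see end to end with a toy example. The sketch below is only an illustration of the assembly order; `render_stub` and its arguments are hypothetical stand-ins, not part of mypy's stub generator API.

```python
# Minimal sketch of the assembly order shown above: imports first, then the
# __all__ line produced by get_dunder_all(), then the generated definitions.
# `render_stub` is a hypothetical helper for illustration only.
def render_stub(imports: str, dunder_all: str, body: str) -> str:
    pieces = [piece for piece in (imports, dunder_all, body) if piece]
    return "\n".join(pieces)


print(render_stub("from foo import bar as bar\n", "__all__ = ['bar']\n", "def baz() -> int: ...\n"))
```
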
@@ -751,7 +764,13 @@ def is_private_name(self, name: str, fullname: str | None = None) -> bool: return False if name == "_": return False - return name.startswith("_") and (not name.endswith("__") or name in self.IGNORED_DUNDERS) + if not name.startswith("_"): + return False + if self._all_ and name in self._all_: + return False + if name.startswith("__") and name.endswith("__"): + return name in self.IGNORED_DUNDERS + return True def should_reexport(self, name: str, full_module: str, name_is_alias: bool) -> bool: if ( @@ -761,18 +780,21 @@ def should_reexport(self, name: str, full_module: str, name_is_alias: bool) -> b ): # Special case certain names that should be exported, against our general rules. return True + if name_is_alias: + return False + if self.export_less: + return False + if not self.module_name: + return False is_private = self.is_private_name(name, full_module + "." + name) + if is_private: + return False top_level = full_module.split(".")[0] self_top_level = self.module_name.split(".", 1)[0] - if ( - not name_is_alias - and not self.export_less - and (not self._all_ or name in self.IGNORED_DUNDERS) - and self.module_name - and not is_private - and top_level in (self_top_level, "_" + self_top_level) - ): + if top_level not in (self_top_level, "_" + self_top_level): # Export imports from the same package, since we can't reliably tell whether they # are part of the public API. - return True - return False + return False + if self._all_: + return name in self._all_ + return True diff --git a/mypy/subtypes.py b/mypy/subtypes.py index 7e37751b1c15..4fd3f8ff98ca 100644 --- a/mypy/subtypes.py +++ b/mypy/subtypes.py @@ -1651,7 +1651,12 @@ def _incompatible(left_arg: FormalArgument | None, right_arg: FormalArgument | N continue return False if not are_args_compatible( - left_arg, right_arg, ignore_pos_arg_names, allow_partial_overlap, is_compat + left_arg, + right_arg, + is_compat, + ignore_pos_arg_names=ignore_pos_arg_names, + allow_partial_overlap=allow_partial_overlap, + allow_imprecise_kinds=right.imprecise_arg_kinds, ): return False @@ -1676,9 +1681,9 @@ def _incompatible(left_arg: FormalArgument | None, right_arg: FormalArgument | N if not are_args_compatible( left_by_position, right_by_position, - ignore_pos_arg_names, - allow_partial_overlap, is_compat, + ignore_pos_arg_names=ignore_pos_arg_names, + allow_partial_overlap=allow_partial_overlap, ): return False i += 1 @@ -1711,7 +1716,11 @@ def _incompatible(left_arg: FormalArgument | None, right_arg: FormalArgument | N continue if not are_args_compatible( - left_by_name, right_by_name, ignore_pos_arg_names, allow_partial_overlap, is_compat + left_by_name, + right_by_name, + is_compat, + ignore_pos_arg_names=ignore_pos_arg_names, + allow_partial_overlap=allow_partial_overlap, ): return False @@ -1735,6 +1744,7 @@ def _incompatible(left_arg: FormalArgument | None, right_arg: FormalArgument | N and right_by_name != right_by_pos and (right_by_pos.required or right_by_name.required) and strict_concatenate_check + and not right.imprecise_arg_kinds ): return False @@ -1749,9 +1759,11 @@ def _incompatible(left_arg: FormalArgument | None, right_arg: FormalArgument | N def are_args_compatible( left: FormalArgument, right: FormalArgument, + is_compat: Callable[[Type, Type], bool], + *, ignore_pos_arg_names: bool, allow_partial_overlap: bool, - is_compat: Callable[[Type, Type], bool], + allow_imprecise_kinds: bool = False, ) -> bool: if left.required and right.required: # If both arguments are required allow_partial_overlap has no 
effect. @@ -1779,7 +1791,7 @@ def is_different(left_item: object | None, right_item: object | None) -> bool: return False # If right is at a specific position, left must have the same: - if is_different(left.pos, right.pos): + if is_different(left.pos, right.pos) and not allow_imprecise_kinds: return False # If right's argument is optional, left's must also be diff --git a/mypy/test/testerrorstream.py b/mypy/test/testerrorstream.py index 4b98f10fc9ca..5ed112fd31e7 100644 --- a/mypy/test/testerrorstream.py +++ b/mypy/test/testerrorstream.py @@ -29,7 +29,7 @@ def test_error_stream(testcase: DataDrivenTestCase) -> None: logged_messages: list[str] = [] - def flush_errors(msgs: list[str], serious: bool) -> None: + def flush_errors(filename: str | None, msgs: list[str], serious: bool) -> None: if msgs: logged_messages.append("==== Errors flushed ====") logged_messages.extend(msgs) diff --git a/mypy/test/testfinegrained.py b/mypy/test/testfinegrained.py index 953f91a60df7..f61a58c425fc 100644 --- a/mypy/test/testfinegrained.py +++ b/mypy/test/testfinegrained.py @@ -149,6 +149,7 @@ def get_options(self, source: str, testcase: DataDrivenTestCase, build_cache: bo options.use_fine_grained_cache = self.use_cache and not build_cache options.cache_fine_grained = self.use_cache options.local_partial_types = True + options.export_types = "inspect" in testcase.file # Treat empty bodies safely for these test cases. options.allow_empty_bodies = not testcase.name.endswith("_no_empty") if re.search("flags:.*--follow-imports", source) is None: @@ -163,7 +164,7 @@ def get_options(self, source: str, testcase: DataDrivenTestCase, build_cache: bo return options def run_check(self, server: Server, sources: list[BuildSource]) -> list[str]: - response = server.check(sources, export_types=True, is_tty=False, terminal_width=-1) + response = server.check(sources, export_types=False, is_tty=False, terminal_width=-1) out = response["out"] or response["err"] assert isinstance(out, str) return out.splitlines() diff --git a/mypy/test/testgraph.py b/mypy/test/testgraph.py index b0d148d5ae9c..0355e75e8c34 100644 --- a/mypy/test/testgraph.py +++ b/mypy/test/testgraph.py @@ -50,7 +50,7 @@ def _make_manager(self) -> BuildManager: plugin=Plugin(options), plugins_snapshot={}, errors=errors, - flush_errors=lambda msgs, serious: None, + flush_errors=lambda filename, msgs, serious: None, fscache=fscache, stdout=sys.stdout, stderr=sys.stderr, diff --git a/mypy/test/testipc.py b/mypy/test/testipc.py index 8ef656dc4579..0224035a7b61 100644 --- a/mypy/test/testipc.py +++ b/mypy/test/testipc.py @@ -2,7 +2,7 @@ import sys import time -from multiprocessing import Process, Queue +from multiprocessing import Queue, get_context from unittest import TestCase, main import pytest @@ -35,10 +35,17 @@ def server_multi_message_echo(q: Queue[str]) -> None: class IPCTests(TestCase): + def setUp(self) -> None: + if sys.platform == "linux": + # The default "fork" start method is potentially unsafe + self.ctx = get_context("forkserver") + else: + self.ctx = get_context("spawn") + def test_transaction_large(self) -> None: - queue: Queue[str] = Queue() + queue: Queue[str] = self.ctx.Queue() msg = "t" * 200000 # longer than the max read size of 100_000 - p = Process(target=server, args=(msg, queue), daemon=True) + p = self.ctx.Process(target=server, args=(msg, queue), daemon=True) p.start() connection_name = queue.get() with IPCClient(connection_name, timeout=1) as client: @@ -49,9 +56,9 @@ def test_transaction_large(self) -> None: p.join() def 
test_connect_twice(self) -> None: - queue: Queue[str] = Queue() + queue: Queue[str] = self.ctx.Queue() msg = "this is a test message" - p = Process(target=server, args=(msg, queue), daemon=True) + p = self.ctx.Process(target=server, args=(msg, queue), daemon=True) p.start() connection_name = queue.get() with IPCClient(connection_name, timeout=1) as client: @@ -67,8 +74,8 @@ def test_connect_twice(self) -> None: assert p.exitcode == 0 def test_multiple_messages(self) -> None: - queue: Queue[str] = Queue() - p = Process(target=server_multi_message_echo, args=(queue,), daemon=True) + queue: Queue[str] = self.ctx.Queue() + p = self.ctx.Process(target=server_multi_message_echo, args=(queue,), daemon=True) p.start() connection_name = queue.get() with IPCClient(connection_name, timeout=1) as client: diff --git a/mypy/test/testparse.py b/mypy/test/testparse.py index 0140eb072821..e33fa7e53ff0 100644 --- a/mypy/test/testparse.py +++ b/mypy/test/testparse.py @@ -8,7 +8,7 @@ from mypy import defaults from mypy.config_parser import parse_mypy_comments -from mypy.errors import CompileError +from mypy.errors import CompileError, Errors from mypy.options import Options from mypy.parse import parse from mypy.test.data import DataDrivenTestCase, DataSuite @@ -51,7 +51,12 @@ def test_parser(testcase: DataDrivenTestCase) -> None: try: n = parse( - bytes(source, "ascii"), fnam="main", module="__main__", errors=None, options=options + bytes(source, "ascii"), + fnam="main", + module="__main__", + errors=Errors(options), + options=options, + raise_on_error=True, ) a = n.str_with_options(options).split("\n") except CompileError as e: @@ -82,7 +87,12 @@ def test_parse_error(testcase: DataDrivenTestCase) -> None: skip() # Compile temporary file. The test file contains non-ASCII characters. parse( - bytes("\n".join(testcase.input), "utf-8"), INPUT_FILE_NAME, "__main__", None, options + bytes("\n".join(testcase.input), "utf-8"), + INPUT_FILE_NAME, + "__main__", + errors=Errors(options), + options=options, + raise_on_error=True, ) raise AssertionError("No errors reported") except CompileError as e: diff --git a/mypy/test/teststubtest.py b/mypy/test/teststubtest.py index a52d9ef5de31..34b266115166 100644 --- a/mypy/test/teststubtest.py +++ b/mypy/test/teststubtest.py @@ -71,6 +71,8 @@ class Sequence(Iterable[_T_co]): ... class Tuple(Sequence[_T_co]): ... class NamedTuple(tuple[Any, ...]): ... def overload(func: _T) -> _T: ... +def type_check_only(func: _T) -> _T: ... +def final(func: _T) -> _T: ... """ stubtest_builtins_stub = """ @@ -426,6 +428,16 @@ def test_default_value(self) -> Iterator[Case]: error=None, ) + # Simulate "<unrepresentable>" + yield Case( + stub="def f11() -> None: ...", + runtime=""" + def f11(text=None) -> None: pass + f11.__text_signature__ = "(text=<unrepresentable>)" + """, + error="f11", + ) + @collect_cases def test_static_class_method(self) -> Iterator[Case]: yield Case( @@ -630,6 +642,24 @@ def f5(__b: str) -> str: ... runtime="def f5(x, /): pass", error=None, ) + yield Case( + stub=""" + from typing import final + from typing_extensions import deprecated + class Foo: + @overload + @final + def f6(self, __a: int) -> int: ... + @overload + @deprecated("evil") + def f6(self, __b: str) -> str: ... + """, + runtime=""" + class Foo: + def f6(self, x, /): pass + """, + error=None, + ) @collect_cases def test_property(self) -> Iterator[Case]: @@ -2027,6 +2057,72 @@ def some(self) -> int: ...
error=None, ) + @collect_cases + def test_type_check_only(self) -> Iterator[Case]: + yield Case( + stub="from typing import type_check_only, overload", + runtime="from typing import overload", + error=None, + ) + # You can have public types that are only defined in stubs + # with `@type_check_only`: + yield Case( + stub=""" + @type_check_only + class A1: ... + """, + runtime="", + error=None, + ) + # Having `@type_check_only` on a type that exists at runtime is an error + yield Case( + stub=""" + @type_check_only + class A2: ... + """, + runtime="class A2: ...", + error="A2", + ) + # The same is true for NamedTuples and TypedDicts: + yield Case( + stub="from typing_extensions import NamedTuple, TypedDict", + runtime="from typing_extensions import NamedTuple, TypedDict", + error=None, + ) + yield Case( + stub=""" + @type_check_only + class NT1(NamedTuple): ... + """, + runtime="class NT1(NamedTuple): ...", + error="NT1", + ) + yield Case( + stub=""" + @type_check_only + class TD1(TypedDict): ... + """, + runtime="class TD1(TypedDict): ...", + error="TD1", + ) + # The same is true for functions: + yield Case( + stub=""" + @type_check_only + def func1() -> None: ... + """, + runtime="", + error=None, + ) + yield Case( + stub=""" + @type_check_only + def func2() -> None: ... + """, + runtime="def func2() -> None: ...", + error="func2", + ) + def remove_color_code(s: str) -> str: return re.sub("\\x1b.*?m", "", s) # this works! @@ -2195,6 +2291,14 @@ def f(a: int, b: int, *, c: int, d: int = 0, **kwargs: Any) -> None: == "def (a, b, *, c, d = ..., **kwargs)" ) + def test_builtin_signature_with_unrepresentable_default(self) -> None: + sig = mypy.stubtest.safe_inspect_signature(bytes.hex) + assert sig is not None + assert ( + str(mypy.stubtest.Signature.from_inspect_signature(sig)) + == "def (self, sep = ..., bytes_per_sep = ...)" + ) + def test_config_file(self) -> None: runtime = "temp = 5\n" stub = "from decimal import Decimal\ntemp: Decimal\n" diff --git a/mypy/test/testutil.py b/mypy/test/testutil.py index 571e4d0b11f2..d0d54ffec8c6 100644 --- a/mypy/test/testutil.py +++ b/mypy/test/testutil.py @@ -4,7 +4,7 @@ from unittest import TestCase, mock from mypy.inspections import parse_location -from mypy.util import get_terminal_width +from mypy.util import _generate_junit_contents, get_terminal_width class TestGetTerminalSize(TestCase): @@ -20,3 +20,70 @@ def test_get_terminal_size_in_pty_defaults_to_80(self) -> None: def test_parse_location_windows(self) -> None: assert parse_location(r"C:\test.py:1:1") == (r"C:\test.py", [1, 1]) assert parse_location(r"C:\test.py:1:1:1:1") == (r"C:\test.py", [1, 1, 1, 1]) + + +class TestWriteJunitXml(TestCase): + def test_junit_pass(self) -> None: + serious = False + messages_by_file: dict[str | None, list[str]] = {} + expected = """ + + + + +""" + result = _generate_junit_contents( + dt=1.23, + serious=serious, + messages_by_file=messages_by_file, + version="3.14", + platform="test-plat", + ) + assert result == expected + + def test_junit_fail_two_files(self) -> None: + serious = False + messages_by_file: dict[str | None, list[str]] = { + "file1.py": ["Test failed", "another line"], + "file2.py": ["Another failure", "line 2"], + } + expected = """ + + + Test failed +another line + + + Another failure +line 2 + + +""" + result = _generate_junit_contents( + dt=1.23, + serious=serious, + messages_by_file=messages_by_file, + version="3.14", + platform="test-plat", + ) + assert result == expected + + def test_serious_error(self) -> None: + serious = True + 
messages_by_file: dict[str | None, list[str]] = {None: ["Error line 1", "Error line 2"]} + expected = """ + + + Error line 1 +Error line 2 + + +""" + result = _generate_junit_contents( + dt=1.23, + serious=serious, + messages_by_file=messages_by_file, + version="3.14", + platform="test-plat", + ) + assert result == expected diff --git a/mypy/typeops.py b/mypy/typeops.py index 2eb3b284e729..2bf8ffbf47ab 100644 --- a/mypy/typeops.py +++ b/mypy/typeops.py @@ -244,15 +244,15 @@ class C(D[E[T]], Generic[T]): ... return expand_type_by_instance(typ, inst_type) -def supported_self_type(typ: ProperType) -> bool: +def supported_self_type(typ: ProperType, allow_callable: bool = True) -> bool: """Is this a supported kind of explicit self-types? - Currently, this means a X or Type[X], where X is an instance or + Currently, this means an X or Type[X], where X is an instance or a type variable with an instance upper bound. """ if isinstance(typ, TypeType): return supported_self_type(typ.item) - if isinstance(typ, CallableType): + if allow_callable and isinstance(typ, CallableType): # Special case: allow class callable instead of Type[...] as cls annotation, # as well as callable self for callback protocols. return True @@ -306,7 +306,11 @@ class B(A): pass self_param_type = get_proper_type(func.arg_types[0]) variables: Sequence[TypeVarLikeType] - if func.variables and supported_self_type(self_param_type): + # Having a def __call__(self: Callable[...], ...) can cause infinite recursion. Although + # this special-casing looks not very principled, there is nothing meaningful we can infer + # from such definition, since it is inherently indefinitely recursive. + allow_callable = func.name is None or not func.name.startswith("__call__ of") + if func.variables and supported_self_type(self_param_type, allow_callable=allow_callable): from mypy.infer import infer_type_arguments if original_type is None: @@ -565,15 +569,15 @@ def _remove_redundant_union_items(items: list[Type], keep_erased: bool) -> list[ return items -def _get_type_special_method_bool_ret_type(t: Type) -> Type | None: +def _get_type_method_ret_type(t: Type, *, name: str) -> Type | None: t = get_proper_type(t) if isinstance(t, Instance): - bool_method = t.type.get("__bool__") - if bool_method: - callee = get_proper_type(bool_method.type) - if isinstance(callee, CallableType): - return callee.ret_type + sym = t.type.get(name) + if sym: + sym_type = get_proper_type(sym.type) + if isinstance(sym_type, CallableType): + return sym_type.ret_type return None @@ -596,7 +600,9 @@ def true_only(t: Type) -> ProperType: can_be_true_items = [item for item in new_items if item.can_be_true] return make_simplified_union(can_be_true_items, line=t.line, column=t.column) else: - ret_type = _get_type_special_method_bool_ret_type(t) + ret_type = _get_type_method_ret_type(t, name="__bool__") or _get_type_method_ret_type( + t, name="__len__" + ) if ret_type and not ret_type.can_be_true: return UninhabitedType(line=t.line, column=t.column) @@ -629,9 +635,14 @@ def false_only(t: Type) -> ProperType: can_be_false_items = [item for item in new_items if item.can_be_false] return make_simplified_union(can_be_false_items, line=t.line, column=t.column) else: - ret_type = _get_type_special_method_bool_ret_type(t) + ret_type = _get_type_method_ret_type(t, name="__bool__") or _get_type_method_ret_type( + t, name="__len__" + ) - if ret_type and not ret_type.can_be_false: + if ret_type: + if not ret_type.can_be_false: + return UninhabitedType(line=t.line) + elif isinstance(t, 
Instance) and t.type.is_final: return UninhabitedType(line=t.line) new_t = copy_type(t) diff --git a/mypy/types.py b/mypy/types.py index 43003a9a22b6..d19766c1de34 100644 --- a/mypy/types.py +++ b/mypy/types.py @@ -113,12 +113,18 @@ # Supported @final decorator names. FINAL_DECORATOR_NAMES: Final = ("typing.final", "typing_extensions.final") +# Supported @type_check_only names. +TYPE_CHECK_ONLY_NAMES: Final = ("typing.type_check_only", "typing_extensions.type_check_only") + # Supported Literal type names. LITERAL_TYPE_NAMES: Final = ("typing.Literal", "typing_extensions.Literal") # Supported Annotated type names. ANNOTATED_TYPE_NAMES: Final = ("typing.Annotated", "typing_extensions.Annotated") +# Supported @deprecated type names +DEPRECATED_TYPE_NAMES: Final = ("warnings.deprecated", "typing_extensions.deprecated") + # We use this constant in various places when checking `tuple` subtyping: TUPLE_LIKE_INSTANCE_NAMES: Final = ( "builtins.tuple", diff --git a/mypy/typeshed/stdlib/VERSIONS b/mypy/typeshed/stdlib/VERSIONS index 9d4636a29a1d..d24e85c8fe44 100644 --- a/mypy/typeshed/stdlib/VERSIONS +++ b/mypy/typeshed/stdlib/VERSIONS @@ -152,8 +152,12 @@ imp: 2.7-3.11 importlib: 2.7- importlib.metadata: 3.8- importlib.metadata._meta: 3.10- +importlib.readers: 3.10- importlib.resources: 3.7- importlib.resources.abc: 3.11- +importlib.resources.readers: 3.11- +importlib.resources.simple: 3.11- +importlib.simple: 3.11- inspect: 2.7- io: 2.7- ipaddress: 3.3- @@ -181,6 +185,7 @@ multiprocessing.shared_memory: 3.8- netrc: 2.7- nis: 2.7- nntplib: 2.7- +nt: 2.7- ntpath: 2.7- nturl2path: 2.7- numbers: 2.7- @@ -250,6 +255,7 @@ sunau: 2.7- symbol: 2.7-3.9 symtable: 2.7- sys: 2.7- +sys._monitoring: 3.12- # Doesn't actually exist. See comments in the stub. sysconfig: 2.7- syslog: 2.7- tabnanny: 2.7- diff --git a/mypy/typeshed/stdlib/_ast.pyi b/mypy/typeshed/stdlib/_ast.pyi index 402b770c0462..0302133fc6f9 100644 --- a/mypy/typeshed/stdlib/_ast.pyi +++ b/mypy/typeshed/stdlib/_ast.pyi @@ -553,7 +553,7 @@ if sys.version_info >= (3, 10): class MatchSingleton(pattern): __match_args__ = ("value",) - value: Literal[True, False, None] + value: Literal[True, False] | None class MatchSequence(pattern): __match_args__ = ("patterns",) diff --git a/mypy/typeshed/stdlib/_codecs.pyi b/mypy/typeshed/stdlib/_codecs.pyi index 51f17f01ca71..f8141d8bad4b 100644 --- a/mypy/typeshed/stdlib/_codecs.pyi +++ b/mypy/typeshed/stdlib/_codecs.pyi @@ -47,11 +47,11 @@ _StrToStrEncoding: TypeAlias = Literal["rot13", "rot_13"] @overload def encode(obj: ReadableBuffer, encoding: _BytesToBytesEncoding, errors: str = "strict") -> bytes: ... @overload -def encode(obj: str, encoding: _StrToStrEncoding, errors: str = "strict") -> str: ... # type: ignore[misc] +def encode(obj: str, encoding: _StrToStrEncoding, errors: str = "strict") -> str: ... # type: ignore[overload-overlap] @overload def encode(obj: str, encoding: str = "utf-8", errors: str = "strict") -> bytes: ... @overload -def decode(obj: ReadableBuffer, encoding: _BytesToBytesEncoding, errors: str = "strict") -> bytes: ... # type: ignore[misc] +def decode(obj: ReadableBuffer, encoding: _BytesToBytesEncoding, errors: str = "strict") -> bytes: ... # type: ignore[overload-overlap] @overload def decode(obj: str, encoding: _StrToStrEncoding, errors: str = "strict") -> str: ... 
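This `_codecs.pyi` hunk, like several `asyncio` stub hunks later in this diff, replaces the blanket `# type: ignore[misc]` on deliberately overlapping overloads with the narrower `# type: ignore[overload-overlap]` error code. A small sketch of the kind of overlap these comments suppress, with invented names (`lookup` and `Key` are illustrative only):

```python
from typing import overload

class Key: ...

# The first signature accepts a subtype of the second signature's parameter
# type but returns an incompatible type, so a checker that reports overlapping
# overloads flags it; the targeted ignore silences only that specific check.
@overload
def lookup(value: Key) -> int: ...  # type: ignore[overload-overlap]
@overload
def lookup(value: object) -> str: ...
def lookup(value: object) -> int | str:
    return 0 if isinstance(value, Key) else "missing"
```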
diff --git a/mypy/typeshed/stdlib/_collections_abc.pyi b/mypy/typeshed/stdlib/_collections_abc.pyi index 2b57f157a0e4..8520e9e4ed9b 100644 --- a/mypy/typeshed/stdlib/_collections_abc.pyi +++ b/mypy/typeshed/stdlib/_collections_abc.pyi @@ -81,7 +81,7 @@ class dict_values(ValuesView[_VT_co], Generic[_KT_co, _VT_co]): # undocumented def mapping(self) -> MappingProxyType[_KT_co, _VT_co]: ... @final -class dict_items(ItemsView[_KT_co, _VT_co], Generic[_KT_co, _VT_co]): # undocumented +class dict_items(ItemsView[_KT_co, _VT_co]): # undocumented def __eq__(self, __value: object) -> bool: ... if sys.version_info >= (3, 10): @property diff --git a/mypy/typeshed/stdlib/_locale.pyi b/mypy/typeshed/stdlib/_locale.pyi index 2b2fe03e4510..d7399f15e1a3 100644 --- a/mypy/typeshed/stdlib/_locale.pyi +++ b/mypy/typeshed/stdlib/_locale.pyi @@ -1,6 +1,6 @@ import sys from _typeshed import StrPath -from collections.abc import Iterable, Mapping +from collections.abc import Mapping LC_CTYPE: int LC_COLLATE: int @@ -10,7 +10,7 @@ LC_NUMERIC: int LC_ALL: int CHAR_MAX: int -def setlocale(category: int, locale: str | Iterable[str | None] | None = None) -> str: ... +def setlocale(__category: int, __locale: str | None = None) -> str: ... def localeconv() -> Mapping[str, int | str | list[int]]: ... if sys.version_info >= (3, 11): diff --git a/mypy/typeshed/stdlib/_typeshed/__init__.pyi b/mypy/typeshed/stdlib/_typeshed/__init__.pyi index 8e92138c748a..33659cf31a12 100644 --- a/mypy/typeshed/stdlib/_typeshed/__init__.pyi +++ b/mypy/typeshed/stdlib/_typeshed/__init__.pyi @@ -47,7 +47,7 @@ Unused: TypeAlias = object # _SentinelType = NewType("_SentinelType", object) # sentinel: _SentinelType # def foo(x: int | None | _SentinelType = ...) -> None: ... -sentinel = Any # noqa: Y026 +sentinel: Any # stable class IdentityFunction(Protocol): @@ -236,6 +236,10 @@ class SupportsNoArgReadline(Protocol[_T_co]): class SupportsWrite(Protocol[_T_contra]): def write(self, __s: _T_contra) -> object: ... +# stable +class SupportsFlush(Protocol): + def flush(self) -> object: ... + # Unfortunately PEP 688 does not allow us to distinguish read-only # from writable buffers. We use these aliases for readability for now. # Perhaps a future extension of the buffer protocol will allow us to diff --git a/mypy/typeshed/stdlib/_typeshed/wsgi.pyi b/mypy/typeshed/stdlib/_typeshed/wsgi.pyi index de731aea918b..e8ebf6409e7f 100644 --- a/mypy/typeshed/stdlib/_typeshed/wsgi.pyi +++ b/mypy/typeshed/stdlib/_typeshed/wsgi.pyi @@ -11,7 +11,7 @@ from typing import Any, Protocol from typing_extensions import TypeAlias class _Readable(Protocol): - def read(self, size: int = ...) -> bytes: ... + def read(self, __size: int = ...) -> bytes: ... # Optional: def close(self) -> object: ... if sys.version_info >= (3, 11): diff --git a/mypy/typeshed/stdlib/_typeshed/xml.pyi b/mypy/typeshed/stdlib/_typeshed/xml.pyi index 231c2b86e912..46c5fab097c4 100644 --- a/mypy/typeshed/stdlib/_typeshed/xml.pyi +++ b/mypy/typeshed/stdlib/_typeshed/xml.pyi @@ -4,6 +4,6 @@ from typing import Any, Protocol # As defined https://docs.python.org/3/library/xml.dom.html#domimplementation-objects class DOMImplementation(Protocol): - def hasFeature(self, feature: str, version: str | None) -> bool: ... - def createDocument(self, namespaceUri: str, qualifiedName: str, doctype: Any | None) -> Any: ... - def createDocumentType(self, qualifiedName: str, publicId: str, systemId: str) -> Any: ... + def hasFeature(self, __feature: str, __version: str | None) -> bool: ... 
+ def createDocument(self, __namespaceUri: str, __qualifiedName: str, __doctype: Any | None) -> Any: ... + def createDocumentType(self, __qualifiedName: str, __publicId: str, __systemId: str) -> Any: ... diff --git a/mypy/typeshed/stdlib/_warnings.pyi b/mypy/typeshed/stdlib/_warnings.pyi index 0981dfeaafee..2e571e676c97 100644 --- a/mypy/typeshed/stdlib/_warnings.pyi +++ b/mypy/typeshed/stdlib/_warnings.pyi @@ -1,13 +1,36 @@ +import sys from typing import Any, overload _defaultaction: str _onceregistry: dict[Any, Any] filters: list[tuple[str, str | None, type[Warning], str | None, int]] -@overload -def warn(message: str, category: type[Warning] | None = None, stacklevel: int = 1, source: Any | None = None) -> None: ... -@overload -def warn(message: Warning, category: Any = None, stacklevel: int = 1, source: Any | None = None) -> None: ... +if sys.version_info >= (3, 12): + @overload + def warn( + message: str, + category: type[Warning] | None = None, + stacklevel: int = 1, + source: Any | None = None, + *, + skip_file_prefixes: tuple[str, ...] = (), + ) -> None: ... + @overload + def warn( + message: Warning, + category: Any = None, + stacklevel: int = 1, + source: Any | None = None, + *, + skip_file_prefixes: tuple[str, ...] = (), + ) -> None: ... + +else: + @overload + def warn(message: str, category: type[Warning] | None = None, stacklevel: int = 1, source: Any | None = None) -> None: ... + @overload + def warn(message: Warning, category: Any = None, stacklevel: int = 1, source: Any | None = None) -> None: ... + @overload def warn_explicit( message: str, diff --git a/mypy/typeshed/stdlib/_weakrefset.pyi b/mypy/typeshed/stdlib/_weakrefset.pyi index d73d79155329..6482ade1271e 100644 --- a/mypy/typeshed/stdlib/_weakrefset.pyi +++ b/mypy/typeshed/stdlib/_weakrefset.pyi @@ -1,6 +1,6 @@ import sys from collections.abc import Iterable, Iterator, MutableSet -from typing import Any, Generic, TypeVar, overload +from typing import Any, TypeVar, overload from typing_extensions import Self if sys.version_info >= (3, 9): @@ -11,7 +11,7 @@ __all__ = ["WeakSet"] _S = TypeVar("_S") _T = TypeVar("_T") -class WeakSet(MutableSet[_T], Generic[_T]): +class WeakSet(MutableSet[_T]): @overload def __init__(self, data: None = None) -> None: ... @overload diff --git a/mypy/typeshed/stdlib/_winapi.pyi b/mypy/typeshed/stdlib/_winapi.pyi index e887fb38a7fa..1aec6ce50443 100644 --- a/mypy/typeshed/stdlib/_winapi.pyi +++ b/mypy/typeshed/stdlib/_winapi.pyi @@ -54,7 +54,7 @@ if sys.platform == "win32": HIGH_PRIORITY_CLASS: Literal[0x80] INFINITE: Literal[0xFFFFFFFF] if sys.version_info >= (3, 8): - # Ignore the flake8 error -- flake8-pyi assumes + # Ignore the Flake8 error -- flake8-pyi assumes # most numbers this long will be implementation details, # but here we can see that it's a power of 2 INVALID_HANDLE_VALUE: Literal[0xFFFFFFFFFFFFFFFF] # noqa: Y054 diff --git a/mypy/typeshed/stdlib/argparse.pyi b/mypy/typeshed/stdlib/argparse.pyi index 924cc8986114..0cbbcd242195 100644 --- a/mypy/typeshed/stdlib/argparse.pyi +++ b/mypy/typeshed/stdlib/argparse.pyi @@ -120,7 +120,7 @@ class _ActionsContainer: def _handle_conflict_resolve(self, action: Action, conflicting_actions: Iterable[tuple[str, Action]]) -> None: ... class _FormatterClass(Protocol): - def __call__(self, prog: str) -> HelpFormatter: ... + def __call__(self, *, prog: str) -> HelpFormatter: ... class ArgumentParser(_AttributeHolder, _ActionsContainer): prog: str @@ -172,7 +172,7 @@ class ArgumentParser(_AttributeHolder, _ActionsContainer): ) -> None: ... 
@overload - def parse_args(self, args: Sequence[str] | None = None, namespace: None = None) -> Namespace: ... # type: ignore[misc] + def parse_args(self, args: Sequence[str] | None = None, namespace: None = None) -> Namespace: ... @overload def parse_args(self, args: Sequence[str] | None, namespace: _N) -> _N: ... @overload @@ -211,7 +211,7 @@ class ArgumentParser(_AttributeHolder, _ActionsContainer): def format_usage(self) -> str: ... def format_help(self) -> str: ... @overload - def parse_known_args(self, args: Sequence[str] | None = None, namespace: None = None) -> tuple[Namespace, list[str]]: ... # type: ignore[misc] + def parse_known_args(self, args: Sequence[str] | None = None, namespace: None = None) -> tuple[Namespace, list[str]]: ... @overload def parse_known_args(self, args: Sequence[str] | None, namespace: _N) -> tuple[_N, list[str]]: ... @overload @@ -220,13 +220,15 @@ class ArgumentParser(_AttributeHolder, _ActionsContainer): def exit(self, status: int = 0, message: str | None = None) -> NoReturn: ... def error(self, message: str) -> NoReturn: ... @overload - def parse_intermixed_args(self, args: Sequence[str] | None = None, namespace: None = None) -> Namespace: ... # type: ignore[misc] + def parse_intermixed_args(self, args: Sequence[str] | None = None, namespace: None = None) -> Namespace: ... @overload def parse_intermixed_args(self, args: Sequence[str] | None, namespace: _N) -> _N: ... @overload def parse_intermixed_args(self, *, namespace: _N) -> _N: ... @overload - def parse_known_intermixed_args(self, args: Sequence[str] | None = None, namespace: None = None) -> tuple[Namespace, list[str]]: ... # type: ignore[misc] + def parse_known_intermixed_args( + self, args: Sequence[str] | None = None, namespace: None = None + ) -> tuple[Namespace, list[str]]: ... @overload def parse_known_intermixed_args(self, args: Sequence[str] | None, namespace: _N) -> tuple[_N, list[str]]: ... @overload diff --git a/mypy/typeshed/stdlib/array.pyi b/mypy/typeshed/stdlib/array.pyi index b533f9240073..2ef821fcf87a 100644 --- a/mypy/typeshed/stdlib/array.pyi +++ b/mypy/typeshed/stdlib/array.pyi @@ -3,7 +3,7 @@ from _typeshed import ReadableBuffer, SupportsRead, SupportsWrite from collections.abc import Iterable # pytype crashes if array inherits from collections.abc.MutableSequence instead of typing.MutableSequence -from typing import Any, Generic, MutableSequence, TypeVar, overload # noqa: Y022 +from typing import Any, MutableSequence, TypeVar, overload # noqa: Y022 from typing_extensions import Literal, Self, SupportsIndex, TypeAlias if sys.version_info >= (3, 12): @@ -18,7 +18,7 @@ _T = TypeVar("_T", int, float, str) typecodes: str -class array(MutableSequence[_T], Generic[_T]): +class array(MutableSequence[_T]): @property def typecode(self) -> _TypeCode: ... @property diff --git a/mypy/typeshed/stdlib/ast.pyi b/mypy/typeshed/stdlib/ast.pyi index a61b4e35fd56..5c9cafc189be 100644 --- a/mypy/typeshed/stdlib/ast.pyi +++ b/mypy/typeshed/stdlib/ast.pyi @@ -4,27 +4,30 @@ from _ast import * from _typeshed import ReadableBuffer, Unused from collections.abc import Iterator from typing import Any, TypeVar as _TypeVar, overload -from typing_extensions import Literal +from typing_extensions import Literal, deprecated if sys.version_info >= (3, 8): class _ABC(type): if sys.version_info >= (3, 9): def __init__(cls, *args: Unused) -> None: ... 
+ @deprecated("Replaced by ast.Constant; removal scheduled for Python 3.14") class Num(Constant, metaclass=_ABC): value: int | float | complex - + @deprecated("Replaced by ast.Constant; removal scheduled for Python 3.14") class Str(Constant, metaclass=_ABC): value: str # Aliases for value, for backwards compatibility s: str - + @deprecated("Replaced by ast.Constant; removal scheduled for Python 3.14") class Bytes(Constant, metaclass=_ABC): value: bytes # Aliases for value, for backwards compatibility s: bytes - + @deprecated("Replaced by ast.Constant; removal scheduled for Python 3.14") class NameConstant(Constant, metaclass=_ABC): ... + + @deprecated("Replaced by ast.Constant; removal scheduled for Python 3.14") class Ellipsis(Constant, metaclass=_ABC): ... if sys.version_info >= (3, 9): diff --git a/mypy/typeshed/stdlib/asyncio/base_events.pyi b/mypy/typeshed/stdlib/asyncio/base_events.pyi index e2b55da8c718..afddcd918584 100644 --- a/mypy/typeshed/stdlib/asyncio/base_events.pyi +++ b/mypy/typeshed/stdlib/asyncio/base_events.pyi @@ -423,7 +423,7 @@ class BaseEventLoop(AbstractEventLoop): bufsize: Literal[0] = 0, encoding: None = None, errors: None = None, - text: Literal[False, None] = None, + text: Literal[False] | None = None, **kwargs: Any, ) -> tuple[SubprocessTransport, _ProtocolT]: ... async def subprocess_exec( @@ -471,3 +471,5 @@ class BaseEventLoop(AbstractEventLoop): async def shutdown_default_executor(self, timeout: float | None = None) -> None: ... elif sys.version_info >= (3, 9): async def shutdown_default_executor(self) -> None: ... + + def __del__(self) -> None: ... diff --git a/mypy/typeshed/stdlib/asyncio/base_subprocess.pyi b/mypy/typeshed/stdlib/asyncio/base_subprocess.pyi index 8f262cd5c760..a5fe24e8768b 100644 --- a/mypy/typeshed/stdlib/asyncio/base_subprocess.pyi +++ b/mypy/typeshed/stdlib/asyncio/base_subprocess.pyi @@ -55,6 +55,7 @@ class BaseSubprocessTransport(transports.SubprocessTransport): async def _wait(self) -> int: ... # undocumented def _try_finish(self) -> None: ... # undocumented def _call_connection_lost(self, exc: BaseException | None) -> None: ... # undocumented + def __del__(self) -> None: ... class WriteSubprocessPipeProto(protocols.BaseProtocol): # undocumented def __init__(self, proc: BaseSubprocessTransport, fd: int) -> None: ... diff --git a/mypy/typeshed/stdlib/asyncio/events.pyi b/mypy/typeshed/stdlib/asyncio/events.pyi index cde63b279b0d..87e7edb461ac 100644 --- a/mypy/typeshed/stdlib/asyncio/events.pyi +++ b/mypy/typeshed/stdlib/asyncio/events.pyi @@ -6,7 +6,7 @@ from collections.abc import Callable, Coroutine, Generator, Sequence from contextvars import Context from socket import AddressFamily, SocketKind, _Address, _RetAddress, socket from typing import IO, Any, Protocol, TypeVar, overload -from typing_extensions import Literal, Self, TypeAlias +from typing_extensions import Literal, Self, TypeAlias, deprecated from . import _AwaitableLike, _CoroutineLike from .base_events import Server @@ -522,7 +522,7 @@ class AbstractEventLoop: bufsize: Literal[0] = 0, encoding: None = None, errors: None = None, - text: Literal[False, None] = ..., + text: Literal[False] | None = ..., **kwargs: Any, ) -> tuple[SubprocessTransport, _ProtocolT]: ... @abstractmethod @@ -613,8 +613,17 @@ def set_event_loop_policy(policy: AbstractEventLoopPolicy | None) -> None: ... def get_event_loop() -> AbstractEventLoop: ... def set_event_loop(loop: AbstractEventLoop | None) -> None: ... def new_event_loop() -> AbstractEventLoop: ... 
-def get_child_watcher() -> AbstractChildWatcher: ... -def set_child_watcher(watcher: AbstractChildWatcher) -> None: ... + +if sys.version_info >= (3, 12): + @deprecated("Deprecated as of Python 3.12; will be removed in Python 3.14") + def get_child_watcher() -> AbstractChildWatcher: ... + @deprecated("Deprecated as of Python 3.12; will be removed in Python 3.14") + def set_child_watcher(watcher: AbstractChildWatcher) -> None: ... + +else: + def get_child_watcher() -> AbstractChildWatcher: ... + def set_child_watcher(watcher: AbstractChildWatcher) -> None: ... + def _set_running_loop(__loop: AbstractEventLoop | None) -> None: ... def _get_running_loop() -> AbstractEventLoop: ... def get_running_loop() -> AbstractEventLoop: ... diff --git a/mypy/typeshed/stdlib/asyncio/proactor_events.pyi b/mypy/typeshed/stdlib/asyncio/proactor_events.pyi index 33fdf84ade4a..4634bbb2b37c 100644 --- a/mypy/typeshed/stdlib/asyncio/proactor_events.pyi +++ b/mypy/typeshed/stdlib/asyncio/proactor_events.pyi @@ -1,19 +1,13 @@ import sys from collections.abc import Mapping from socket import socket -from typing import Any, ClassVar, Protocol +from typing import Any, ClassVar from typing_extensions import Literal from . import base_events, constants, events, futures, streams, transports __all__ = ("BaseProactorEventLoop",) -if sys.version_info >= (3, 8): - class _WarnCallbackProtocol(Protocol): - def __call__( - self, message: str, category: type[Warning] | None = ..., stacklevel: int = ..., source: Any | None = ... - ) -> object: ... - class _ProactorBasePipeTransport(transports._FlowControlMixin, transports.BaseTransport): def __init__( self, @@ -24,10 +18,7 @@ class _ProactorBasePipeTransport(transports._FlowControlMixin, transports.BaseTr extra: Mapping[Any, Any] | None = None, server: events.AbstractServer | None = None, ) -> None: ... - if sys.version_info >= (3, 8): - def __del__(self, _warn: _WarnCallbackProtocol = ...) -> None: ... - else: - def __del__(self) -> None: ... + def __del__(self) -> None: ... class _ProactorReadPipeTransport(_ProactorBasePipeTransport, transports.ReadTransport): if sys.version_info >= (3, 10): diff --git a/mypy/typeshed/stdlib/asyncio/sslproto.pyi b/mypy/typeshed/stdlib/asyncio/sslproto.pyi index 09733e5f9a01..393a1fbdc468 100644 --- a/mypy/typeshed/stdlib/asyncio/sslproto.pyi +++ b/mypy/typeshed/stdlib/asyncio/sslproto.pyi @@ -83,6 +83,8 @@ class _SSLProtocolTransport(transports._FlowControlMixin, transports.Transport): def set_read_buffer_limits(self, high: int | None = None, low: int | None = None) -> None: ... def get_read_buffer_size(self) -> int: ... + def __del__(self) -> None: ... + if sys.version_info >= (3, 11): _SSLProtocolBase: TypeAlias = protocols.BufferedProtocol else: diff --git a/mypy/typeshed/stdlib/asyncio/streams.pyi b/mypy/typeshed/stdlib/asyncio/streams.pyi index 804be1ca5065..81a94425f8de 100644 --- a/mypy/typeshed/stdlib/asyncio/streams.pyi +++ b/mypy/typeshed/stdlib/asyncio/streams.pyi @@ -128,6 +128,7 @@ class StreamReaderProtocol(FlowControlMixin, protocols.Protocol): client_connected_cb: _ClientConnectedCallback | None = None, loop: events.AbstractEventLoop | None = None, ) -> None: ... + def __del__(self) -> None: ... class StreamWriter: def __init__( @@ -161,6 +162,8 @@ class StreamWriter: async def start_tls( self, sslcontext: ssl.SSLContext, *, server_hostname: str | None = None, ssl_handshake_timeout: float | None = None ) -> None: ... + if sys.version_info >= (3, 11): + def __del__(self) -> None: ... 
class StreamReader(AsyncIterator[bytes]): def __init__(self, limit: int = 65536, loop: events.AbstractEventLoop | None = None) -> None: ... diff --git a/mypy/typeshed/stdlib/asyncio/subprocess.pyi b/mypy/typeshed/stdlib/asyncio/subprocess.pyi index b8877b360527..03aea65f6d54 100644 --- a/mypy/typeshed/stdlib/asyncio/subprocess.pyi +++ b/mypy/typeshed/stdlib/asyncio/subprocess.pyi @@ -54,7 +54,7 @@ if sys.version_info >= (3, 11): bufsize: Literal[0] = 0, encoding: None = None, errors: None = None, - text: Literal[False, None] = None, + text: Literal[False] | None = None, # These parameters are taken by subprocess.Popen, which this ultimately delegates to executable: StrOrBytesPath | None = None, preexec_fn: Callable[[], Any] | None = None, @@ -80,14 +80,14 @@ if sys.version_info >= (3, 11): stdout: int | IO[Any] | None = None, stderr: int | IO[Any] | None = None, limit: int = 65536, - # These parameters are forced to these values by BaseEventLoop.subprocess_shell + # These parameters are forced to these values by BaseEventLoop.subprocess_exec universal_newlines: Literal[False] = False, - shell: Literal[True] = True, + shell: Literal[False] = False, bufsize: Literal[0] = 0, encoding: None = None, errors: None = None, + text: Literal[False] | None = None, # These parameters are taken by subprocess.Popen, which this ultimately delegates to - text: bool | None = None, executable: StrOrBytesPath | None = None, preexec_fn: Callable[[], Any] | None = None, close_fds: bool = True, @@ -120,7 +120,7 @@ elif sys.version_info >= (3, 10): bufsize: Literal[0] = 0, encoding: None = None, errors: None = None, - text: Literal[False, None] = None, + text: Literal[False] | None = None, # These parameters are taken by subprocess.Popen, which this ultimately delegates to executable: StrOrBytesPath | None = None, preexec_fn: Callable[[], Any] | None = None, @@ -145,14 +145,14 @@ elif sys.version_info >= (3, 10): stdout: int | IO[Any] | None = None, stderr: int | IO[Any] | None = None, limit: int = 65536, - # These parameters are forced to these values by BaseEventLoop.subprocess_shell + # These parameters are forced to these values by BaseEventLoop.subprocess_exec universal_newlines: Literal[False] = False, - shell: Literal[True] = True, + shell: Literal[False] = False, bufsize: Literal[0] = 0, encoding: None = None, errors: None = None, + text: Literal[False] | None = None, # These parameters are taken by subprocess.Popen, which this ultimately delegates to - text: bool | None = None, executable: StrOrBytesPath | None = None, preexec_fn: Callable[[], Any] | None = None, close_fds: bool = True, @@ -185,7 +185,7 @@ else: # >= 3.9 bufsize: Literal[0] = 0, encoding: None = None, errors: None = None, - text: Literal[False, None] = None, + text: Literal[False] | None = None, # These parameters are taken by subprocess.Popen, which this ultimately delegates to executable: StrOrBytesPath | None = None, preexec_fn: Callable[[], Any] | None = None, @@ -210,14 +210,14 @@ else: # >= 3.9 stderr: int | IO[Any] | None = None, loop: events.AbstractEventLoop | None = None, limit: int = 65536, - # These parameters are forced to these values by BaseEventLoop.subprocess_shell + # These parameters are forced to these values by BaseEventLoop.subprocess_exec universal_newlines: Literal[False] = False, - shell: Literal[True] = True, + shell: Literal[False] = False, bufsize: Literal[0] = 0, encoding: None = None, errors: None = None, + text: Literal[False] | None = None, # These parameters are taken by subprocess.Popen, which this 
ultimately delegates to - text: bool | None = None, executable: StrOrBytesPath | None = None, preexec_fn: Callable[[], Any] | None = None, close_fds: bool = True, diff --git a/mypy/typeshed/stdlib/asyncio/tasks.pyi b/mypy/typeshed/stdlib/asyncio/tasks.pyi index 366ac7fa35e3..7c76abaf1dca 100644 --- a/mypy/typeshed/stdlib/asyncio/tasks.pyi +++ b/mypy/typeshed/stdlib/asyncio/tasks.pyi @@ -2,7 +2,7 @@ import concurrent.futures import sys from collections.abc import Awaitable, Coroutine, Generator, Iterable, Iterator from types import FrameType -from typing import Any, Generic, Protocol, TextIO, TypeVar, overload +from typing import Any, Protocol, TextIO, TypeVar, overload from typing_extensions import Literal, TypeAlias from . import _CoroutineLike @@ -86,7 +86,7 @@ else: ) -> Iterator[Future[_T]]: ... @overload -def ensure_future(coro_or_future: _FT, *, loop: AbstractEventLoop | None = None) -> _FT: ... # type: ignore[misc] +def ensure_future(coro_or_future: _FT, *, loop: AbstractEventLoop | None = None) -> _FT: ... # type: ignore[overload-overlap] @overload def ensure_future(coro_or_future: Awaitable[_T], *, loop: AbstractEventLoop | None = None) -> Task[_T]: ... @@ -95,17 +95,16 @@ def ensure_future(coro_or_future: Awaitable[_T], *, loop: AbstractEventLoop | No # zip() because typing does not support variadic type variables. See # typing PR #1550 for discussion. # -# The many type: ignores here are because the overloads overlap, -# but having overlapping overloads is the only way to get acceptable type inference in all edge cases. +# N.B. Having overlapping overloads is the only way to get acceptable type inference in all edge cases. if sys.version_info >= (3, 10): @overload - def gather(__coro_or_future1: _FutureLike[_T1], *, return_exceptions: Literal[False] = False) -> Future[tuple[_T1]]: ... # type: ignore[misc] + def gather(__coro_or_future1: _FutureLike[_T1], *, return_exceptions: Literal[False] = False) -> Future[tuple[_T1]]: ... # type: ignore[overload-overlap] @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], *, return_exceptions: Literal[False] = False ) -> Future[tuple[_T1, _T2]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -113,7 +112,7 @@ if sys.version_info >= (3, 10): return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2, _T3]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -122,7 +121,7 @@ if sys.version_info >= (3, 10): return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2, _T3, _T4]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -132,7 +131,7 @@ if sys.version_info >= (3, 10): return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2, _T3, _T4, _T5]]: ... 
@overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -143,15 +142,15 @@ if sys.version_info >= (3, 10): return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2, _T3, _T4, _T5, _T6]]: ... @overload - def gather(*coros_or_futures: _FutureLike[_T], return_exceptions: Literal[False] = False) -> Future[list[_T]]: ... # type: ignore[misc] + def gather(*coros_or_futures: _FutureLike[_T], return_exceptions: Literal[False] = False) -> Future[list[_T]]: ... # type: ignore[overload-overlap] @overload - def gather(__coro_or_future1: _FutureLike[_T1], *, return_exceptions: bool) -> Future[tuple[_T1 | BaseException]]: ... # type: ignore[misc] + def gather(__coro_or_future1: _FutureLike[_T1], *, return_exceptions: bool) -> Future[tuple[_T1 | BaseException]]: ... # type: ignore[overload-overlap] @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], *, return_exceptions: bool ) -> Future[tuple[_T1 | BaseException, _T2 | BaseException]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -159,7 +158,7 @@ if sys.version_info >= (3, 10): return_exceptions: bool, ) -> Future[tuple[_T1 | BaseException, _T2 | BaseException, _T3 | BaseException]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -168,7 +167,7 @@ if sys.version_info >= (3, 10): return_exceptions: bool, ) -> Future[tuple[_T1 | BaseException, _T2 | BaseException, _T3 | BaseException, _T4 | BaseException]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -180,7 +179,7 @@ if sys.version_info >= (3, 10): tuple[_T1 | BaseException, _T2 | BaseException, _T3 | BaseException, _T4 | BaseException, _T5 | BaseException] ]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -204,11 +203,11 @@ if sys.version_info >= (3, 10): else: @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], *, loop: AbstractEventLoop | None = None, return_exceptions: Literal[False] = False ) -> Future[tuple[_T1]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], *, @@ -216,7 +215,7 @@ else: return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -225,7 +224,7 @@ else: return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2, _T3]]: ... 
@overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -235,7 +234,7 @@ else: return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2, _T3, _T4]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -246,7 +245,7 @@ else: return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2, _T3, _T4, _T5]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -258,15 +257,15 @@ else: return_exceptions: Literal[False] = False, ) -> Future[tuple[_T1, _T2, _T3, _T4, _T5, _T6]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] *coros_or_futures: _FutureLike[_T], loop: AbstractEventLoop | None = None, return_exceptions: Literal[False] = False ) -> Future[list[_T]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], *, loop: AbstractEventLoop | None = None, return_exceptions: bool ) -> Future[tuple[_T1 | BaseException]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], *, @@ -274,7 +273,7 @@ else: return_exceptions: bool, ) -> Future[tuple[_T1 | BaseException, _T2 | BaseException]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -283,7 +282,7 @@ else: return_exceptions: bool, ) -> Future[tuple[_T1 | BaseException, _T2 | BaseException, _T3 | BaseException]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -293,7 +292,7 @@ else: return_exceptions: bool, ) -> Future[tuple[_T1 | BaseException, _T2 | BaseException, _T3 | BaseException, _T4 | BaseException]]: ... @overload - def gather( # type: ignore[misc] + def gather( # type: ignore[overload-overlap] __coro_or_future1: _FutureLike[_T1], __coro_or_future2: _FutureLike[_T2], __coro_or_future3: _FutureLike[_T3], @@ -314,7 +313,7 @@ else: ] ]: ... @overload - def gather( # type: ignore[misc] + def gather( *coros_or_futures: _FutureLike[_T], loop: AbstractEventLoop | None = None, return_exceptions: bool ) -> Future[list[_T | BaseException]]: ... @@ -338,7 +337,9 @@ else: if sys.version_info >= (3, 11): @overload - async def wait(fs: Iterable[_FT], *, timeout: float | None = None, return_when: str = "ALL_COMPLETED") -> tuple[set[_FT], set[_FT]]: ... # type: ignore[misc] + async def wait( + fs: Iterable[_FT], *, timeout: float | None = None, return_when: str = "ALL_COMPLETED" + ) -> tuple[set[_FT], set[_FT]]: ... 
@overload async def wait( fs: Iterable[Task[_T]], *, timeout: float | None = None, return_when: str = "ALL_COMPLETED" @@ -346,7 +347,9 @@ if sys.version_info >= (3, 11): elif sys.version_info >= (3, 10): @overload - async def wait(fs: Iterable[_FT], *, timeout: float | None = None, return_when: str = "ALL_COMPLETED") -> tuple[set[_FT], set[_FT]]: ... # type: ignore[misc] + async def wait( # type: ignore[overload-overlap] + fs: Iterable[_FT], *, timeout: float | None = None, return_when: str = "ALL_COMPLETED" + ) -> tuple[set[_FT], set[_FT]]: ... @overload async def wait( fs: Iterable[Awaitable[_T]], *, timeout: float | None = None, return_when: str = "ALL_COMPLETED" @@ -354,7 +357,7 @@ elif sys.version_info >= (3, 10): else: @overload - async def wait( # type: ignore[misc] + async def wait( # type: ignore[overload-overlap] fs: Iterable[_FT], *, loop: AbstractEventLoop | None = None, @@ -379,7 +382,7 @@ else: # While this is true in general, here it's sort-of okay to have a covariant subclass, # since the only reason why `asyncio.Future` is invariant is the `set_result()` method, # and `asyncio.Task.set_result()` always raises. -class Task(Future[_T_co], Generic[_T_co]): # type: ignore[type-var] # pyright: ignore[reportGeneralTypeIssues] +class Task(Future[_T_co]): # type: ignore[type-var] # pyright: ignore[reportGeneralTypeIssues] if sys.version_info >= (3, 12): def __init__( self, diff --git a/mypy/typeshed/stdlib/asyncio/unix_events.pyi b/mypy/typeshed/stdlib/asyncio/unix_events.pyi index e28d64b5287b..d440206aa0b9 100644 --- a/mypy/typeshed/stdlib/asyncio/unix_events.pyi +++ b/mypy/typeshed/stdlib/asyncio/unix_events.pyi @@ -3,7 +3,7 @@ import types from abc import ABCMeta, abstractmethod from collections.abc import Callable from typing import Any -from typing_extensions import Literal, Self +from typing_extensions import Literal, Self, deprecated from .events import AbstractEventLoop, BaseDefaultEventLoopPolicy from .selector_events import BaseSelectorEventLoop @@ -11,22 +11,46 @@ from .selector_events import BaseSelectorEventLoop # This is also technically not available on Win, # but other parts of typeshed need this definition. # So, it is special cased. -class AbstractChildWatcher: - @abstractmethod - def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... - @abstractmethod - def remove_child_handler(self, pid: int) -> bool: ... - @abstractmethod - def attach_loop(self, loop: AbstractEventLoop | None) -> None: ... - @abstractmethod - def close(self) -> None: ... - @abstractmethod - def __enter__(self) -> Self: ... - @abstractmethod - def __exit__(self, typ: type[BaseException] | None, exc: BaseException | None, tb: types.TracebackType | None) -> None: ... - if sys.version_info >= (3, 8): +if sys.version_info >= (3, 12): + @deprecated("Deprecated as of Python 3.12; will be removed in Python 3.14") + class AbstractChildWatcher: + @abstractmethod + def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... + @abstractmethod + def remove_child_handler(self, pid: int) -> bool: ... + @abstractmethod + def attach_loop(self, loop: AbstractEventLoop | None) -> None: ... + @abstractmethod + def close(self) -> None: ... + @abstractmethod + def __enter__(self) -> Self: ... @abstractmethod - def is_active(self) -> bool: ... + def __exit__( + self, typ: type[BaseException] | None, exc: BaseException | None, tb: types.TracebackType | None + ) -> None: ... 
+ if sys.version_info >= (3, 8): + @abstractmethod + def is_active(self) -> bool: ... + +else: + class AbstractChildWatcher: + @abstractmethod + def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... + @abstractmethod + def remove_child_handler(self, pid: int) -> bool: ... + @abstractmethod + def attach_loop(self, loop: AbstractEventLoop | None) -> None: ... + @abstractmethod + def close(self) -> None: ... + @abstractmethod + def __enter__(self) -> Self: ... + @abstractmethod + def __exit__( + self, typ: type[BaseException] | None, exc: BaseException | None, tb: types.TracebackType | None + ) -> None: ... + if sys.version_info >= (3, 8): + @abstractmethod + def is_active(self) -> bool: ... if sys.platform != "win32": if sys.version_info >= (3, 9): @@ -62,35 +86,61 @@ if sys.platform != "win32": def attach_loop(self, loop: AbstractEventLoop | None) -> None: ... - class SafeChildWatcher(BaseChildWatcher): - def __enter__(self) -> Self: ... - def __exit__(self, a: type[BaseException] | None, b: BaseException | None, c: types.TracebackType | None) -> None: ... - def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... - def remove_child_handler(self, pid: int) -> bool: ... + if sys.version_info >= (3, 12): + @deprecated("Deprecated as of Python 3.12; will be removed in Python 3.14") + class SafeChildWatcher(BaseChildWatcher): + def __enter__(self) -> Self: ... + def __exit__(self, a: type[BaseException] | None, b: BaseException | None, c: types.TracebackType | None) -> None: ... + def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... + def remove_child_handler(self, pid: int) -> bool: ... - class FastChildWatcher(BaseChildWatcher): - def __enter__(self) -> Self: ... - def __exit__(self, a: type[BaseException] | None, b: BaseException | None, c: types.TracebackType | None) -> None: ... - def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... - def remove_child_handler(self, pid: int) -> bool: ... + @deprecated("Deprecated as of Python 3.12; will be removed in Python 3.14") + class FastChildWatcher(BaseChildWatcher): + def __enter__(self) -> Self: ... + def __exit__(self, a: type[BaseException] | None, b: BaseException | None, c: types.TracebackType | None) -> None: ... + def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... + def remove_child_handler(self, pid: int) -> bool: ... + else: + class SafeChildWatcher(BaseChildWatcher): + def __enter__(self) -> Self: ... + def __exit__(self, a: type[BaseException] | None, b: BaseException | None, c: types.TracebackType | None) -> None: ... + def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... + def remove_child_handler(self, pid: int) -> bool: ... + + class FastChildWatcher(BaseChildWatcher): + def __enter__(self) -> Self: ... + def __exit__(self, a: type[BaseException] | None, b: BaseException | None, c: types.TracebackType | None) -> None: ... + def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... + def remove_child_handler(self, pid: int) -> bool: ... class _UnixSelectorEventLoop(BaseSelectorEventLoop): ... class _UnixDefaultEventLoopPolicy(BaseDefaultEventLoopPolicy): - def get_child_watcher(self) -> AbstractChildWatcher: ... - def set_child_watcher(self, watcher: AbstractChildWatcher | None) -> None: ... 
+ if sys.version_info >= (3, 12): + @deprecated("Deprecated as of Python 3.12; will be removed in Python 3.14") + def get_child_watcher(self) -> AbstractChildWatcher: ... + @deprecated("Deprecated as of Python 3.12; will be removed in Python 3.14") + def set_child_watcher(self, watcher: AbstractChildWatcher | None) -> None: ... + else: + def get_child_watcher(self) -> AbstractChildWatcher: ... + def set_child_watcher(self, watcher: AbstractChildWatcher | None) -> None: ... SelectorEventLoop = _UnixSelectorEventLoop DefaultEventLoopPolicy = _UnixDefaultEventLoopPolicy - if sys.version_info >= (3, 8): - from typing import Protocol - - class _Warn(Protocol): - def __call__( - self, message: str, category: type[Warning] | None = ..., stacklevel: int = ..., source: Any | None = ... - ) -> object: ... - + if sys.version_info >= (3, 12): + @deprecated("Deprecated as of Python 3.12; will be removed in Python 3.14") + class MultiLoopChildWatcher(AbstractChildWatcher): + def is_active(self) -> bool: ... + def close(self) -> None: ... + def __enter__(self) -> Self: ... + def __exit__( + self, exc_type: type[BaseException] | None, exc_val: BaseException | None, exc_tb: types.TracebackType | None + ) -> None: ... + def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... + def remove_child_handler(self, pid: int) -> bool: ... + def attach_loop(self, loop: AbstractEventLoop | None) -> None: ... + elif sys.version_info >= (3, 8): class MultiLoopChildWatcher(AbstractChildWatcher): def is_active(self) -> bool: ... def close(self) -> None: ... @@ -102,6 +152,7 @@ if sys.platform != "win32": def remove_child_handler(self, pid: int) -> bool: ... def attach_loop(self, loop: AbstractEventLoop | None) -> None: ... + if sys.version_info >= (3, 8): class ThreadedChildWatcher(AbstractChildWatcher): def is_active(self) -> Literal[True]: ... def close(self) -> None: ... @@ -109,7 +160,7 @@ if sys.platform != "win32": def __exit__( self, exc_type: type[BaseException] | None, exc_val: BaseException | None, exc_tb: types.TracebackType | None ) -> None: ... - def __del__(self, _warn: _Warn = ...) -> None: ... + def __del__(self) -> None: ... def add_child_handler(self, pid: int, callback: Callable[..., object], *args: Any) -> None: ... def remove_child_handler(self, pid: int) -> bool: ... def attach_loop(self, loop: AbstractEventLoop | None) -> None: ... diff --git a/mypy/typeshed/stdlib/asyncio/windows_utils.pyi b/mypy/typeshed/stdlib/asyncio/windows_utils.pyi index 9f88718b7b70..ed5d8da275c5 100644 --- a/mypy/typeshed/stdlib/asyncio/windows_utils.pyi +++ b/mypy/typeshed/stdlib/asyncio/windows_utils.pyi @@ -2,16 +2,12 @@ import subprocess import sys from collections.abc import Callable from types import TracebackType -from typing import Any, AnyStr, Protocol +from typing import Any, AnyStr from typing_extensions import Literal, Self if sys.platform == "win32": __all__ = ("pipe", "Popen", "PIPE", "PipeHandle") - class _WarnFunction(Protocol): - def __call__( - self, message: str, category: type[Warning] = ..., stacklevel: int = ..., source: PipeHandle = ... - ) -> object: ... BUFSIZE: Literal[8192] PIPE = subprocess.PIPE STDOUT = subprocess.STDOUT @@ -19,11 +15,7 @@ if sys.platform == "win32": class PipeHandle: def __init__(self, handle: int) -> None: ... - if sys.version_info >= (3, 8): - def __del__(self, _warn: _WarnFunction = ...) -> None: ... - else: - def __del__(self) -> None: ... - + def __del__(self) -> None: ... def __enter__(self) -> Self: ... 
def __exit__(self, t: type[BaseException] | None, v: BaseException | None, tb: TracebackType | None) -> None: ... @property diff --git a/mypy/typeshed/stdlib/builtins.pyi b/mypy/typeshed/stdlib/builtins.pyi index dedd72933028..e3d7ee7e5cc1 100644 --- a/mypy/typeshed/stdlib/builtins.pyi +++ b/mypy/typeshed/stdlib/builtins.pyi @@ -18,6 +18,7 @@ from _typeshed import ( SupportsAiter, SupportsAnext, SupportsDivMod, + SupportsFlush, SupportsIter, SupportsKeysAndGetItem, SupportsLenAndGetItem, @@ -62,6 +63,7 @@ from typing_extensions import ( TypeAlias, TypeGuard, TypeVarTuple, + deprecated, final, ) @@ -159,8 +161,9 @@ class classmethod(Generic[_T, _P, _R_co]): def __wrapped__(self) -> Callable[Concatenate[type[_T], _P], _R_co]: ... class type: + # object.__base__ is None. Otherwise, it would be a type. @property - def __base__(self) -> type: ... + def __base__(self) -> type | None: ... __bases__: tuple[type, ...] @property def __basicsize__(self) -> int: ... @@ -287,7 +290,7 @@ class int: def __pow__(self, __value: _PositiveInteger, __mod: None = None) -> int: ... @overload def __pow__(self, __value: _NegativeInteger, __mod: None = None) -> float: ... - # positive x -> int; negative x -> float + # positive __value -> int; negative __value -> float # return type must be Any as `int | float` causes too many false-positive errors @overload def __pow__(self, __value: int, __mod: None = None) -> Any: ... @@ -346,7 +349,7 @@ class float: def __divmod__(self, __value: float) -> tuple[float, float]: ... @overload def __pow__(self, __value: int, __mod: None = None) -> float: ... - # positive x -> float; negative x -> complex + # positive __value -> float; negative __value -> complex # return type must be Any as `float | complex` causes too many false-positive errors @overload def __pow__(self, __value: float, __mod: None = None) -> Any: ... @@ -843,6 +846,8 @@ class bool(int): @overload def __rxor__(self, __value: int) -> int: ... def __getnewargs__(self) -> tuple[int]: ... + @deprecated("Will throw an error in Python 3.14. Use `not` for logical negation of bools instead.") + def __invert__(self) -> int: ... @final class slice: @@ -860,7 +865,7 @@ class slice: __hash__: ClassVar[None] # type: ignore[assignment] def indices(self, __len: SupportsIndex) -> tuple[int, int, int]: ... -class tuple(Sequence[_T_co], Generic[_T_co]): +class tuple(Sequence[_T_co]): def __new__(cls, __iterable: Iterable[_T_co] = ...) -> Self: ... def __len__(self) -> int: ... def __contains__(self, __key: object) -> bool: ... @@ -912,7 +917,7 @@ class function: # mypy uses `builtins.function.__get__` to represent methods, properties, and getset_descriptors so we type the return as Any. def __get__(self, __instance: object, __owner: type | None = None) -> Any: ... -class list(MutableSequence[_T], Generic[_T]): +class list(MutableSequence[_T]): @overload def __init__(self) -> None: ... @overload @@ -967,7 +972,7 @@ class list(MutableSequence[_T], Generic[_T]): if sys.version_info >= (3, 9): def __class_getitem__(cls, __item: Any) -> GenericAlias: ... -class dict(MutableMapping[_KT, _VT], Generic[_KT, _VT]): +class dict(MutableMapping[_KT, _VT]): # __init__ should be kept roughly in line with `collections.UserDict.__init__`, which has similar semantics # Also multiprocessing.managers.SyncManager.dict() @overload @@ -1040,7 +1045,7 @@ class dict(MutableMapping[_KT, _VT], Generic[_KT, _VT]): @overload def __ior__(self, __value: Iterable[tuple[_KT, _VT]]) -> Self: ... 
-class set(MutableSet[_T], Generic[_T]): +class set(MutableSet[_T]): @overload def __init__(self) -> None: ... @overload @@ -1080,7 +1085,7 @@ class set(MutableSet[_T], Generic[_T]): if sys.version_info >= (3, 9): def __class_getitem__(cls, __item: Any) -> GenericAlias: ... -class frozenset(AbstractSet[_T_co], Generic[_T_co]): +class frozenset(AbstractSet[_T_co]): @overload def __new__(cls) -> Self: ... @overload @@ -1109,7 +1114,7 @@ class frozenset(AbstractSet[_T_co], Generic[_T_co]): if sys.version_info >= (3, 9): def __class_getitem__(cls, __item: Any) -> GenericAlias: ... -class enumerate(Iterator[tuple[int, _T]], Generic[_T]): +class enumerate(Iterator[tuple[int, _T]]): def __new__(cls, iterable: Iterable[_T], start: int = ...) -> Self: ... def __iter__(self) -> Self: ... def __next__(self) -> tuple[int, _T]: ... @@ -1194,7 +1199,7 @@ if sys.version_info >= (3, 10): # See discussion in #7491 and pure-Python implementation of `anext` at https://github.com/python/cpython/blob/ea786a882b9ed4261eafabad6011bc7ef3b5bf94/Lib/test/test_asyncgen.py#L52-L80 def anext(__i: _SupportsSynchronousAnext[_AwaitableT]) -> _AwaitableT: ... @overload - async def anext(__i: SupportsAnext[_T], default: _VT) -> _T | _VT: ... + async def anext(__i: SupportsAnext[_T], __default: _VT) -> _T | _VT: ... # compile() returns a CodeType, unless the flags argument includes PyCF_ONLY_AST (=1024), # in which case it returns ast.AST. We have overloads for flag 0 (the default) and for @@ -1318,7 +1323,7 @@ else: def exit(code: sys._ExitCode = None) -> NoReturn: ... -class filter(Iterator[_T], Generic[_T]): +class filter(Iterator[_T]): @overload def __new__(cls, __function: None, __iterable: Iterable[_T | None]) -> Self: ... @overload @@ -1340,9 +1345,9 @@ def getattr(__o: object, __name: str, __default: None) -> Any | None: ... @overload def getattr(__o: object, __name: str, __default: bool) -> Any | bool: ... @overload -def getattr(__o: object, name: str, __default: list[Any]) -> Any | list[Any]: ... +def getattr(__o: object, __name: str, __default: list[Any]) -> Any | list[Any]: ... @overload -def getattr(__o: object, name: str, __default: dict[Any, Any]) -> Any | dict[Any, Any]: ... +def getattr(__o: object, __name: str, __default: dict[Any, Any]) -> Any | dict[Any, Any]: ... @overload def getattr(__o: object, __name: str, __default: _T) -> Any | _T: ... def globals() -> dict[str, Any]: ... @@ -1357,13 +1362,13 @@ class _GetItemIterable(Protocol[_T_co]): def __getitem__(self, __i: int) -> _T_co: ... @overload -def iter(__iterable: SupportsIter[_SupportsNextT]) -> _SupportsNextT: ... +def iter(__object: SupportsIter[_SupportsNextT]) -> _SupportsNextT: ... @overload -def iter(__iterable: _GetItemIterable[_T]) -> Iterator[_T]: ... +def iter(__object: _GetItemIterable[_T]) -> Iterator[_T]: ... @overload -def iter(__function: Callable[[], _T | None], __sentinel: None) -> Iterator[_T]: ... +def iter(__object: Callable[[], _T | None], __sentinel: None) -> Iterator[_T]: ... @overload -def iter(__function: Callable[[], _T], __sentinel: object) -> Iterator[_T]: ... +def iter(__object: Callable[[], _T], __sentinel: object) -> Iterator[_T]: ... # Keep this alias in sync with unittest.case._ClassInfo if sys.version_info >= (3, 10): @@ -1377,7 +1382,7 @@ def len(__obj: Sized) -> int: ... def license() -> None: ... def locals() -> dict[str, Any]: ... -class map(Iterator[_S], Generic[_S]): +class map(Iterator[_S]): @overload def __new__(cls, __func: Callable[[_T1], _S], __iter1: Iterable[_T1]) -> Self: ... 
@overload @@ -1544,8 +1549,7 @@ def open( ) -> IO[Any]: ... def ord(__c: str | bytes | bytearray) -> int: ... -class _SupportsWriteAndFlush(SupportsWrite[_T_contra], Protocol[_T_contra]): - def flush(self) -> None: ... +class _SupportsWriteAndFlush(SupportsWrite[_T_contra], SupportsFlush, Protocol[_T_contra]): ... @overload def print( @@ -1649,7 +1653,7 @@ else: def quit(code: sys._ExitCode = None) -> NoReturn: ... -class reversed(Iterator[_T], Generic[_T]): +class reversed(Iterator[_T]): @overload def __init__(self, __sequence: Reversible[_T]) -> None: ... @overload @@ -1698,11 +1702,11 @@ _SupportsSumNoDefaultT = TypeVar("_SupportsSumNoDefaultT", bound=_SupportsSumWit # Instead, we special-case the most common examples of this: bool and literal integers. if sys.version_info >= (3, 8): @overload - def sum(__iterable: Iterable[bool], start: int = 0) -> int: ... # type: ignore[misc] + def sum(__iterable: Iterable[bool], start: int = 0) -> int: ... # type: ignore[overload-overlap] else: @overload - def sum(__iterable: Iterable[bool], __start: int = 0) -> int: ... # type: ignore[misc] + def sum(__iterable: Iterable[bool], __start: int = 0) -> int: ... # type: ignore[overload-overlap] @overload def sum(__iterable: Iterable[_SupportsSumNoDefaultT]) -> _SupportsSumNoDefaultT | Literal[0]: ... @@ -1719,11 +1723,11 @@ else: # (A "SupportsDunderDict" protocol doesn't work) # Use a type: ignore to make complaints about overlapping overloads go away @overload -def vars(__object: type) -> types.MappingProxyType[str, Any]: ... # type: ignore[misc] +def vars(__object: type) -> types.MappingProxyType[str, Any]: ... # type: ignore[overload-overlap] @overload def vars(__object: Any = ...) -> dict[str, Any]: ... -class zip(Iterator[_T_co], Generic[_T_co]): +class zip(Iterator[_T_co]): if sys.version_info >= (3, 10): @overload def __new__(cls, *, strict: bool = ...) -> zip[Any]: ... diff --git a/mypy/typeshed/stdlib/cgi.pyi b/mypy/typeshed/stdlib/cgi.pyi index a2acfa92d463..21bf8ca25394 100644 --- a/mypy/typeshed/stdlib/cgi.pyi +++ b/mypy/typeshed/stdlib/cgi.pyi @@ -117,7 +117,8 @@ class FieldStorage: def __contains__(self, key: str) -> bool: ... def __len__(self) -> int: ... def __bool__(self) -> bool: ... - # In Python 3 it returns bytes or str IO depending on an internal flag + def __del__(self) -> None: ... + # Returns bytes or str IO depending on an internal flag def make_file(self) -> IO[Any]: ... def print_exception( diff --git a/mypy/typeshed/stdlib/cmath.pyi b/mypy/typeshed/stdlib/cmath.pyi index 0a85600e99b7..658cfb2d40ed 100644 --- a/mypy/typeshed/stdlib/cmath.pyi +++ b/mypy/typeshed/stdlib/cmath.pyi @@ -30,7 +30,7 @@ def exp(__z: _C) -> complex: ... def isclose(a: _C, b: _C, *, rel_tol: SupportsFloat = 1e-09, abs_tol: SupportsFloat = 0.0) -> bool: ... def isinf(__z: _C) -> bool: ... def isnan(__z: _C) -> bool: ... -def log(__x: _C, __y_obj: _C = ...) -> complex: ... +def log(__x: _C, __base: _C = ...) -> complex: ... def log10(__z: _C) -> complex: ... def phase(__z: _C) -> float: ... def polar(__z: _C) -> tuple[float, float]: ... diff --git a/mypy/typeshed/stdlib/codecs.pyi b/mypy/typeshed/stdlib/codecs.pyi index f8c92392a599..985a52702bc8 100644 --- a/mypy/typeshed/stdlib/codecs.pyi +++ b/mypy/typeshed/stdlib/codecs.pyi @@ -213,7 +213,7 @@ class StreamWriter(Codec): def reset(self) -> None: ... def __enter__(self) -> Self: ... def __exit__(self, type: type[BaseException] | None, value: BaseException | None, tb: types.TracebackType | None) -> None: ... 
- def __getattr__(self, name: str, getattr: Callable[[str], Any] = ...) -> Any: ... + def __getattr__(self, name: str, getattr: Callable[[Any, str], Any] = ...) -> Any: ... class StreamReader(Codec): stream: _ReadableStream @@ -227,7 +227,7 @@ class StreamReader(Codec): def __exit__(self, type: type[BaseException] | None, value: BaseException | None, tb: types.TracebackType | None) -> None: ... def __iter__(self) -> Self: ... def __next__(self) -> str: ... - def __getattr__(self, name: str, getattr: Callable[[str], Any] = ...) -> Any: ... + def __getattr__(self, name: str, getattr: Callable[[Any, str], Any] = ...) -> Any: ... # Doesn't actually inherit from TextIO, but wraps a BinaryIO to provide text reading and writing # and delegates attributes to the underlying binary stream with __getattr__. diff --git a/mypy/typeshed/stdlib/collections/__init__.pyi b/mypy/typeshed/stdlib/collections/__init__.pyi index 1d560117a54f..955681c6ac0c 100644 --- a/mypy/typeshed/stdlib/collections/__init__.pyi +++ b/mypy/typeshed/stdlib/collections/__init__.pyi @@ -45,7 +45,7 @@ def namedtuple( defaults: Iterable[Any] | None = None, ) -> type[tuple[Any, ...]]: ... -class UserDict(MutableMapping[_KT, _VT], Generic[_KT, _VT]): +class UserDict(MutableMapping[_KT, _VT]): data: dict[_KT, _VT] # __init__ should be kept roughly in line with `dict.__init__`, which has the same semantics @overload @@ -87,9 +87,9 @@ class UserDict(MutableMapping[_KT, _VT], Generic[_KT, _VT]): def __or__(self, other: UserDict[_KT, _VT] | dict[_KT, _VT]) -> Self: ... @overload def __or__(self, other: UserDict[_T1, _T2] | dict[_T1, _T2]) -> UserDict[_KT | _T1, _VT | _T2]: ... - @overload # type: ignore[misc] + @overload def __ror__(self, other: UserDict[_KT, _VT] | dict[_KT, _VT]) -> Self: ... - @overload # type: ignore[misc] + @overload def __ror__(self, other: UserDict[_T1, _T2] | dict[_T1, _T2]) -> UserDict[_KT | _T1, _VT | _T2]: ... # UserDict.__ior__ should be kept roughly in line with MutableMapping.update() @overload # type: ignore[misc] @@ -228,7 +228,7 @@ class UserString(Sequence[UserString]): def upper(self) -> Self: ... def zfill(self, width: int) -> Self: ... -class deque(MutableSequence[_T], Generic[_T]): +class deque(MutableSequence[_T]): @property def maxlen(self) -> int | None: ... @overload @@ -372,6 +372,13 @@ class OrderedDict(dict[_KT, _VT], Reversible[_KT], Generic[_KT, _VT]): def setdefault(self: OrderedDict[_KT, _T | None], key: _KT, default: None = None) -> _T | None: ... @overload def setdefault(self, key: _KT, default: _VT) -> _VT: ... + # Same as dict.pop, but accepts keyword arguments + @overload + def pop(self, key: _KT) -> _VT: ... + @overload + def pop(self, key: _KT, default: _VT) -> _VT: ... + @overload + def pop(self, key: _KT, default: _T) -> _VT | _T: ... def __eq__(self, __value: object) -> bool: ... if sys.version_info >= (3, 9): @overload @@ -383,7 +390,7 @@ class OrderedDict(dict[_KT, _VT], Reversible[_KT], Generic[_KT, _VT]): @overload def __ror__(self, __value: dict[_T1, _T2]) -> OrderedDict[_KT | _T1, _VT | _T2]: ... # type: ignore[misc] -class defaultdict(dict[_KT, _VT], Generic[_KT, _VT]): +class defaultdict(dict[_KT, _VT]): default_factory: Callable[[], _VT] | None @overload def __init__(self) -> None: ... @@ -424,7 +431,7 @@ class defaultdict(dict[_KT, _VT], Generic[_KT, _VT]): @overload def __ror__(self, __value: dict[_T1, _T2]) -> defaultdict[_KT | _T1, _VT | _T2]: ... 
# type: ignore[misc] -class ChainMap(MutableMapping[_KT, _VT], Generic[_KT, _VT]): +class ChainMap(MutableMapping[_KT, _VT]): maps: list[MutableMapping[_KT, _VT]] def __init__(self, *maps: MutableMapping[_KT, _VT]) -> None: ... def new_child(self, m: MutableMapping[_KT, _VT] | None = None) -> Self: ... diff --git a/mypy/typeshed/stdlib/compileall.pyi b/mypy/typeshed/stdlib/compileall.pyi index 7520c2f5b676..7f101bf79f6d 100644 --- a/mypy/typeshed/stdlib/compileall.pyi +++ b/mypy/typeshed/stdlib/compileall.pyi @@ -6,7 +6,7 @@ from typing import Any, Protocol __all__ = ["compile_dir", "compile_file", "compile_path"] class _SupportsSearch(Protocol): - def search(self, string: str) -> Any: ... + def search(self, __string: str) -> Any: ... if sys.version_info >= (3, 10): def compile_dir( diff --git a/mypy/typeshed/stdlib/contextlib.pyi b/mypy/typeshed/stdlib/contextlib.pyi index dc2101dc01f7..ce46d0d39830 100644 --- a/mypy/typeshed/stdlib/contextlib.pyi +++ b/mypy/typeshed/stdlib/contextlib.pyi @@ -56,7 +56,7 @@ class AbstractAsyncContextManager(Protocol[_T_co]): class ContextDecorator: def __call__(self, func: _F) -> _F: ... -class _GeneratorContextManager(AbstractContextManager[_T_co], ContextDecorator, Generic[_T_co]): +class _GeneratorContextManager(AbstractContextManager[_T_co], ContextDecorator): # __init__ and all instance attributes are actually inherited from _GeneratorContextManagerBase # _GeneratorContextManagerBase is more trouble than it's worth to include in the stub; see #6676 def __init__(self, func: Callable[..., Iterator[_T_co]], args: tuple[Any, ...], kwds: dict[str, Any]) -> None: ... @@ -81,7 +81,7 @@ if sys.version_info >= (3, 10): class AsyncContextDecorator: def __call__(self, func: _AF) -> _AF: ... - class _AsyncGeneratorContextManager(AbstractAsyncContextManager[_T_co], AsyncContextDecorator, Generic[_T_co]): + class _AsyncGeneratorContextManager(AbstractAsyncContextManager[_T_co], AsyncContextDecorator): # __init__ and these attributes are actually defined in the base class _GeneratorContextManagerBase, # which is more trouble than it's worth to include in the stub (see #6676) def __init__(self, func: Callable[..., AsyncIterator[_T_co]], args: tuple[Any, ...], kwds: dict[str, Any]) -> None: ... @@ -94,7 +94,7 @@ if sys.version_info >= (3, 10): ) -> bool | None: ... else: - class _AsyncGeneratorContextManager(AbstractAsyncContextManager[_T_co], Generic[_T_co]): + class _AsyncGeneratorContextManager(AbstractAsyncContextManager[_T_co]): def __init__(self, func: Callable[..., AsyncIterator[_T_co]], args: tuple[Any, ...], kwds: dict[str, Any]) -> None: ... gen: AsyncGenerator[_T_co, Any] func: Callable[..., AsyncGenerator[_T_co, Any]] diff --git a/mypy/typeshed/stdlib/contextvars.pyi b/mypy/typeshed/stdlib/contextvars.pyi index 63b5f80aea6c..825c018d580f 100644 --- a/mypy/typeshed/stdlib/contextvars.pyi +++ b/mypy/typeshed/stdlib/contextvars.pyi @@ -23,17 +23,10 @@ class ContextVar(Generic[_T]): def name(self) -> str: ... @overload def get(self) -> _T: ... - if sys.version_info >= (3, 8): - @overload - def get(self, default: _T) -> _T: ... - @overload - def get(self, default: _D) -> _D | _T: ... - else: - @overload - def get(self, __default: _T) -> _T: ... - @overload - def get(self, __default: _D) -> _D | _T: ... - + @overload + def get(self, __default: _T) -> _T: ... + @overload + def get(self, __default: _D) -> _D | _T: ... def set(self, __value: _T) -> Token[_T]: ... def reset(self, __token: Token[_T]) -> None: ... 
if sys.version_info >= (3, 9): @@ -57,7 +50,7 @@ def copy_context() -> Context: ... class Context(Mapping[ContextVar[Any], Any]): def __init__(self) -> None: ... @overload - def get(self, __key: ContextVar[_T], __default: None = None) -> _T | None: ... # type: ignore[misc] # overlapping overloads + def get(self, __key: ContextVar[_T], __default: None = None) -> _T | None: ... @overload def get(self, __key: ContextVar[_T], __default: _T) -> _T: ... @overload diff --git a/mypy/typeshed/stdlib/csv.pyi b/mypy/typeshed/stdlib/csv.pyi index 53425fbcccb1..f48d9d2ff263 100644 --- a/mypy/typeshed/stdlib/csv.pyi +++ b/mypy/typeshed/stdlib/csv.pyi @@ -23,7 +23,7 @@ from _csv import ( ) if sys.version_info >= (3, 12): - from _csv import QUOTE_STRINGS as QUOTE_STRINGS, QUOTE_NOTNULL as QUOTE_NOTNULL + from _csv import QUOTE_NOTNULL as QUOTE_NOTNULL, QUOTE_STRINGS as QUOTE_STRINGS from _typeshed import SupportsWrite from collections.abc import Collection, Iterable, Iterator, Mapping, Sequence from typing import Any, Generic, TypeVar, overload diff --git a/mypy/typeshed/stdlib/dbm/__init__.pyi b/mypy/typeshed/stdlib/dbm/__init__.pyi index 0068d67b6ad1..d7115528868b 100644 --- a/mypy/typeshed/stdlib/dbm/__init__.pyi +++ b/mypy/typeshed/stdlib/dbm/__init__.pyi @@ -90,5 +90,5 @@ class _error(Exception): ... error: tuple[type[_error], type[OSError]] -def whichdb(filename: str) -> str: ... +def whichdb(filename: str) -> str | None: ... def open(file: str, flag: _TFlags = "r", mode: int = 0o666) -> _Database: ... diff --git a/mypy/typeshed/stdlib/doctest.pyi b/mypy/typeshed/stdlib/doctest.pyi index f3c05781ad92..7e334ef0c504 100644 --- a/mypy/typeshed/stdlib/doctest.pyi +++ b/mypy/typeshed/stdlib/doctest.pyi @@ -206,8 +206,8 @@ class DocTestCase(unittest.TestCase): self, test: DocTest, optionflags: int = 0, - setUp: Callable[[DocTest], Any] | None = None, - tearDown: Callable[[DocTest], Any] | None = None, + setUp: Callable[[DocTest], object] | None = None, + tearDown: Callable[[DocTest], object] | None = None, checker: OutputChecker | None = None, ) -> None: ... def runTest(self) -> None: ... diff --git a/mypy/typeshed/stdlib/email/headerregistry.pyi b/mypy/typeshed/stdlib/email/headerregistry.pyi index e158e89818f7..94623e96f208 100644 --- a/mypy/typeshed/stdlib/email/headerregistry.pyi +++ b/mypy/typeshed/stdlib/email/headerregistry.pyi @@ -143,9 +143,9 @@ if sys.version_info >= (3, 8): class _HeaderParser(Protocol): max_count: ClassVar[Literal[1] | None] @staticmethod - def value_parser(value: str) -> TokenList: ... + def value_parser(__value: str) -> TokenList: ... @classmethod - def parse(cls, value: str, kwds: dict[str, Any]) -> None: ... + def parse(cls, __value: str, __kwds: dict[str, Any]) -> None: ... class HeaderRegistry: registry: dict[str, type[_HeaderParser]] diff --git a/mypy/typeshed/stdlib/fileinput.pyi b/mypy/typeshed/stdlib/fileinput.pyi index e9f3713b4eaf..c2fd31d1ea77 100644 --- a/mypy/typeshed/stdlib/fileinput.pyi +++ b/mypy/typeshed/stdlib/fileinput.pyi @@ -2,7 +2,7 @@ import sys from _typeshed import AnyStr_co, StrOrBytesPath from collections.abc import Callable, Iterable, Iterator from types import TracebackType -from typing import IO, Any, AnyStr, Generic, Protocol, overload +from typing import IO, Any, AnyStr, Protocol, overload from typing_extensions import Literal, Self, TypeAlias if sys.version_info >= (3, 9): @@ -158,7 +158,7 @@ def fileno() -> int: ... def isfirstline() -> bool: ... def isstdin() -> bool: ... 
-class FileInput(Iterator[AnyStr], Generic[AnyStr]): +class FileInput(Iterator[AnyStr]): if sys.version_info >= (3, 10): # encoding and errors are added @overload diff --git a/mypy/typeshed/stdlib/functools.pyi b/mypy/typeshed/stdlib/functools.pyi index 1b4e59b7c120..4d8c45e96103 100644 --- a/mypy/typeshed/stdlib/functools.pyi +++ b/mypy/typeshed/stdlib/functools.pyi @@ -34,9 +34,9 @@ _T = TypeVar("_T") _S = TypeVar("_S") @overload -def reduce(function: Callable[[_T, _S], _T], sequence: Iterable[_S], initial: _T) -> _T: ... +def reduce(__function: Callable[[_T, _S], _T], __sequence: Iterable[_S], __initial: _T) -> _T: ... @overload -def reduce(function: Callable[[_T, _T], _T], sequence: Iterable[_T]) -> _T: ... +def reduce(__function: Callable[[_T, _T], _T], __sequence: Iterable[_T]) -> _T: ... class _CacheInfo(NamedTuple): hits: int diff --git a/mypy/typeshed/stdlib/hashlib.pyi b/mypy/typeshed/stdlib/hashlib.pyi index 18b1ab549764..ed1321f23b9e 100644 --- a/mypy/typeshed/stdlib/hashlib.pyi +++ b/mypy/typeshed/stdlib/hashlib.pyi @@ -113,14 +113,7 @@ shake_128 = _VarLenHash shake_256 = _VarLenHash def scrypt( - password: ReadableBuffer, - *, - salt: ReadableBuffer | None = None, - n: int | None = None, - r: int | None = None, - p: int | None = None, - maxmem: int = 0, - dklen: int = 64, + password: ReadableBuffer, *, salt: ReadableBuffer, n: int, r: int, p: int, maxmem: int = 0, dklen: int = 64 ) -> bytes: ... @final class _BlakeHash(_Hash): diff --git a/mypy/typeshed/stdlib/http/client.pyi b/mypy/typeshed/stdlib/http/client.pyi index 3e5e496ab501..20223695a1a8 100644 --- a/mypy/typeshed/stdlib/http/client.pyi +++ b/mypy/typeshed/stdlib/http/client.pyi @@ -176,7 +176,7 @@ class HTTPConnection: def connect(self) -> None: ... def close(self) -> None: ... def putrequest(self, method: str, url: str, skip_host: bool = False, skip_accept_encoding: bool = False) -> None: ... - def putheader(self, header: str, *argument: str) -> None: ... + def putheader(self, header: str | bytes, *argument: str | bytes) -> None: ... def endheaders(self, message_body: _DataType | None = None, *, encode_chunked: bool = False) -> None: ... def send(self, data: _DataType | str) -> None: ... @@ -187,7 +187,7 @@ class HTTPSConnection(HTTPConnection): def __init__( self, host: str, - port: str | None = None, + port: int | None = None, *, timeout: float | None = ..., source_address: tuple[str, int] | None = None, diff --git a/mypy/typeshed/stdlib/imghdr.pyi b/mypy/typeshed/stdlib/imghdr.pyi index ed3647f20fc5..d0960a5a1c5c 100644 --- a/mypy/typeshed/stdlib/imghdr.pyi +++ b/mypy/typeshed/stdlib/imghdr.pyi @@ -6,8 +6,8 @@ __all__ = ["what"] class _ReadableBinary(Protocol): def tell(self) -> int: ... - def read(self, size: int) -> bytes: ... - def seek(self, offset: int) -> Any: ... + def read(self, __size: int) -> bytes: ... + def seek(self, __offset: int) -> Any: ... @overload def what(file: StrPath | _ReadableBinary, h: None = None) -> str | None: ... diff --git a/mypy/typeshed/stdlib/imp.pyi b/mypy/typeshed/stdlib/imp.pyi index 3f2920de9c2b..b532f480fa13 100644 --- a/mypy/typeshed/stdlib/imp.pyi +++ b/mypy/typeshed/stdlib/imp.pyi @@ -45,7 +45,7 @@ class _FileLike(Protocol): def read(self) -> str | bytes: ... def close(self) -> Any: ... def __enter__(self) -> Any: ... - def __exit__(self, typ: type[BaseException] | None, exc: BaseException | None, tb: TracebackType | None) -> Any: ... + def __exit__(self, __typ: type[BaseException] | None, __exc: BaseException | None, __tb: TracebackType | None) -> Any: ... 
# PathLike doesn't work for the pathname argument here def load_source(name: str, pathname: str, file: _FileLike | None = None) -> types.ModuleType: ... diff --git a/mypy/typeshed/stdlib/importlib/abc.pyi b/mypy/typeshed/stdlib/importlib/abc.pyi index 28c33205a4df..148e12ec7e3f 100644 --- a/mypy/typeshed/stdlib/importlib/abc.pyi +++ b/mypy/typeshed/stdlib/importlib/abc.pyi @@ -1,20 +1,12 @@ import _ast import sys import types -from _typeshed import ( - OpenBinaryMode, - OpenBinaryModeReading, - OpenBinaryModeUpdating, - OpenBinaryModeWriting, - OpenTextMode, - ReadableBuffer, - StrPath, -) +from _typeshed import ReadableBuffer, StrPath from abc import ABCMeta, abstractmethod from collections.abc import Iterator, Mapping, Sequence from importlib.machinery import ModuleSpec -from io import BufferedRandom, BufferedReader, BufferedWriter, FileIO, TextIOWrapper -from typing import IO, Any, BinaryIO, NoReturn, Protocol, overload, runtime_checkable +from io import BufferedReader +from typing import IO, Any, Protocol, overload, runtime_checkable from typing_extensions import Literal if sys.version_info >= (3, 11): @@ -77,7 +69,7 @@ if sys.version_info >= (3, 12): def invalidate_caches(self) -> None: ... # Not defined on the actual class, but expected to exist. def find_spec( - self, fullname: str, path: Sequence[str] | None, target: types.ModuleType | None = ... + self, __fullname: str, __path: Sequence[str] | None, __target: types.ModuleType | None = ... ) -> ModuleSpec | None: ... class PathEntryFinder(metaclass=ABCMeta): @@ -92,7 +84,7 @@ else: def invalidate_caches(self) -> None: ... # Not defined on the actual class, but expected to exist. def find_spec( - self, fullname: str, path: Sequence[str] | None, target: types.ModuleType | None = ... + self, __fullname: str, __path: Sequence[str] | None, __target: types.ModuleType | None = ... ) -> ModuleSpec | None: ... class PathEntryFinder(Finder): @@ -139,72 +131,26 @@ if sys.version_info >= (3, 9): def joinpath(self, *descendants: str) -> Traversable: ... else: @abstractmethod - def joinpath(self, child: str) -> Traversable: ... - # The .open method comes from pathlib.pyi and should be kept in sync. - @overload - @abstractmethod - def open( - self, - mode: OpenTextMode = "r", - buffering: int = ..., - encoding: str | None = ..., - errors: str | None = ..., - newline: str | None = ..., - ) -> TextIOWrapper: ... - # Unbuffered binary mode: returns a FileIO - @overload - @abstractmethod - def open( - self, mode: OpenBinaryMode, buffering: Literal[0], encoding: None = None, errors: None = None, newline: None = None - ) -> FileIO: ... - # Buffering is on: return BufferedRandom, BufferedReader, or BufferedWriter - @overload - @abstractmethod - def open( - self, - mode: OpenBinaryModeUpdating, - buffering: Literal[-1, 1] = ..., - encoding: None = None, - errors: None = None, - newline: None = None, - ) -> BufferedRandom: ... - @overload - @abstractmethod - def open( - self, - mode: OpenBinaryModeWriting, - buffering: Literal[-1, 1] = ..., - encoding: None = None, - errors: None = None, - newline: None = None, - ) -> BufferedWriter: ... - @overload - @abstractmethod - def open( - self, - mode: OpenBinaryModeReading, - buffering: Literal[-1, 1] = ..., - encoding: None = None, - errors: None = None, - newline: None = None, - ) -> BufferedReader: ... - # Buffering cannot be determined: fall back to BinaryIO + def joinpath(self, __child: str) -> Traversable: ... 
+ + # The documentation and runtime protocol allows *args, **kwargs arguments, + # but this would mean that all implementers would have to support them, + # which is not the case. @overload @abstractmethod - def open( - self, mode: OpenBinaryMode, buffering: int = ..., encoding: None = None, errors: None = None, newline: None = None - ) -> BinaryIO: ... - # Fallback if mode is not specified + def open(self, __mode: Literal["r", "w"] = "r", *, encoding: str | None = None, errors: str | None = None) -> IO[str]: ... @overload @abstractmethod - def open( - self, mode: str, buffering: int = ..., encoding: str | None = ..., errors: str | None = ..., newline: str | None = ... - ) -> IO[Any]: ... + def open(self, __mode: Literal["rb", "wb"]) -> IO[bytes]: ... @property @abstractmethod def name(self) -> str: ... - @abstractmethod - def __truediv__(self, child: str) -> Traversable: ... + if sys.version_info >= (3, 10): + def __truediv__(self, __child: str) -> Traversable: ... + else: + @abstractmethod + def __truediv__(self, __child: str) -> Traversable: ... + @abstractmethod def read_bytes(self) -> bytes: ... @abstractmethod @@ -214,6 +160,6 @@ if sys.version_info >= (3, 9): @abstractmethod def files(self) -> Traversable: ... def open_resource(self, resource: str) -> BufferedReader: ... - def resource_path(self, resource: Any) -> NoReturn: ... + def resource_path(self, resource: Any) -> str: ... def is_resource(self, path: str) -> bool: ... def contents(self) -> Iterator[str]: ... diff --git a/mypy/typeshed/stdlib/importlib/machinery.pyi b/mypy/typeshed/stdlib/importlib/machinery.pyi index 1a9680ab3c46..a0431905a828 100644 --- a/mypy/typeshed/stdlib/importlib/machinery.pyi +++ b/mypy/typeshed/stdlib/importlib/machinery.pyi @@ -2,8 +2,9 @@ import importlib.abc import sys import types from _typeshed import ReadableBuffer -from collections.abc import Callable, Iterable, Sequence +from collections.abc import Callable, Iterable, MutableSequence, Sequence from typing import Any +from typing_extensions import Literal, deprecated if sys.version_info >= (3, 8): from importlib.metadata import DistributionFinder, PathDistribution @@ -158,3 +159,23 @@ class ExtensionFileLoader(importlib.abc.ExecutionLoader): def get_code(self, fullname: str) -> None: ... def __eq__(self, other: object) -> bool: ... def __hash__(self) -> int: ... + +if sys.version_info >= (3, 11): + import importlib.readers + + class NamespaceLoader(importlib.abc.InspectLoader): + def __init__( + self, name: str, path: MutableSequence[str], path_finder: Callable[[str, tuple[str, ...]], ModuleSpec] + ) -> None: ... + def is_package(self, fullname: str) -> Literal[True]: ... + def get_source(self, fullname: str) -> Literal[""]: ... + def get_code(self, fullname: str) -> types.CodeType: ... + def create_module(self, spec: ModuleSpec) -> None: ... + def exec_module(self, module: types.ModuleType) -> None: ... + @deprecated("load_module() is deprecated; use exec_module() instead") + def load_module(self, fullname: str) -> types.ModuleType: ... + def get_resource_reader(self, module: types.ModuleType) -> importlib.readers.NamespaceReader: ... + if sys.version_info < (3, 12): + @staticmethod + @deprecated("module_repr() is deprecated, and has been removed in Python 3.12") + def module_repr(module: types.ModuleType) -> str: ... 
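The simplified `Traversable.open()` overloads above tie the return type to the mode literal: the default text modes give `IO[str]`, while `"rb"`/`"wb"` give `IO[bytes]`. A minimal sketch of what this looks like from user code (the package and file names are only placeholders):

```python
from importlib.resources import files

resource = files("mypkg") / "data.txt"  # "mypkg" is a placeholder package name

with resource.open() as f:        # default "r" -> IO[str]
    text: str = f.read()

with resource.open("rb") as f:    # "rb" -> IO[bytes]
    raw: bytes = f.read()
```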
diff --git a/mypy/typeshed/stdlib/importlib/metadata/__init__.pyi b/mypy/typeshed/stdlib/importlib/metadata/__init__.pyi index e52756544e9a..fd470b8f061d 100644 --- a/mypy/typeshed/stdlib/importlib/metadata/__init__.pyi +++ b/mypy/typeshed/stdlib/importlib/metadata/__init__.pyi @@ -207,7 +207,7 @@ if sys.version_info >= (3, 12): elif sys.version_info >= (3, 10): @overload - def entry_points() -> SelectableGroups: ... # type: ignore[misc] + def entry_points() -> SelectableGroups: ... # type: ignore[overload-overlap] @overload def entry_points( *, name: str = ..., value: str = ..., group: str = ..., module: str = ..., attr: str = ..., extras: list[str] = ... diff --git a/mypy/typeshed/stdlib/importlib/readers.pyi b/mypy/typeshed/stdlib/importlib/readers.pyi new file mode 100644 index 000000000000..f34794601b59 --- /dev/null +++ b/mypy/typeshed/stdlib/importlib/readers.pyi @@ -0,0 +1,68 @@ +# On py311+, things are actually defined in importlib.resources.readers, +# and re-exported here, +# but doing it this way leads to less code duplication for us + +import pathlib +import sys +import zipfile +from _typeshed import Incomplete, StrPath +from collections.abc import Iterable, Iterator +from io import BufferedReader +from typing import NoReturn, TypeVar +from typing_extensions import Literal, Never + +if sys.version_info >= (3, 11): + import importlib.resources.abc as abc +else: + import importlib.abc as abc + +if sys.version_info >= (3, 10): + if sys.version_info >= (3, 11): + __all__ = ["FileReader", "ZipReader", "MultiplexedPath", "NamespaceReader"] + + if sys.version_info < (3, 11): + _T = TypeVar("_T") + + def remove_duplicates(items: Iterable[_T]) -> Iterator[_T]: ... + + class FileReader(abc.TraversableResources): + path: pathlib.Path + def __init__(self, loader) -> None: ... + def resource_path(self, resource: StrPath) -> str: ... + def files(self) -> pathlib.Path: ... + + class ZipReader(abc.TraversableResources): + prefix: str + archive: Incomplete + def __init__(self, loader, module: str) -> None: ... + def open_resource(self, resource: str) -> BufferedReader: ... + def is_resource(self, path: StrPath) -> bool: ... + def files(self) -> zipfile.Path: ... + + class MultiplexedPath(abc.Traversable): + def __init__(self, *paths: abc.Traversable) -> None: ... + def iterdir(self) -> Iterator[abc.Traversable]: ... + def read_bytes(self) -> NoReturn: ... + def read_text(self, *args: Never, **kwargs: Never) -> NoReturn: ... # type: ignore[override] + def is_dir(self) -> Literal[True]: ... + def is_file(self) -> Literal[False]: ... + + if sys.version_info >= (3, 12): + def joinpath(self, *descendants: str) -> abc.Traversable: ... + elif sys.version_info >= (3, 11): + def joinpath(self, child: str) -> abc.Traversable: ... # type: ignore[override] + else: + def joinpath(self, child: str) -> abc.Traversable: ... + + if sys.version_info < (3, 12): + __truediv__ = joinpath + + def open(self, *args: Never, **kwargs: Never) -> NoReturn: ... # type: ignore[override] + @property + def name(self) -> str: ... + + class NamespaceReader(abc.TraversableResources): + path: MultiplexedPath + def __init__(self, namespace_path) -> None: ... + def resource_path(self, resource: str) -> str: ... + def files(self) -> MultiplexedPath: ... 
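For the `importlib.metadata.entry_points()` overloads above (3.10/3.11), keyword selection sidesteps the legacy dict-like `SelectableGroups` return. A minimal sketch of the selected form, with an example group name:

```python
import sys
from importlib.metadata import entry_points

if sys.version_info >= (3, 10):
    # Keyword selection returns EntryPoints rather than the legacy
    # dict-like SelectableGroups; "console_scripts" is just an example group.
    eps = entry_points(group="console_scripts")
    for ep in eps:
        print(ep.name, "->", ep.value)
```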
diff --git a/mypy/typeshed/stdlib/importlib/resources/readers.pyi b/mypy/typeshed/stdlib/importlib/resources/readers.pyi new file mode 100644 index 000000000000..0ab21fd29114 --- /dev/null +++ b/mypy/typeshed/stdlib/importlib/resources/readers.pyi @@ -0,0 +1,14 @@ +# On py311+, things are actually defined here +# and re-exported from importlib.readers, +# but doing it this way leads to less code duplication for us + +import sys +from collections.abc import Iterable, Iterator +from typing import TypeVar + +if sys.version_info >= (3, 11): + from importlib.readers import * + + _T = TypeVar("_T") + + def remove_duplicates(items: Iterable[_T]) -> Iterator[_T]: ... diff --git a/mypy/typeshed/stdlib/importlib/resources/simple.pyi b/mypy/typeshed/stdlib/importlib/resources/simple.pyi new file mode 100644 index 000000000000..9502375d00a2 --- /dev/null +++ b/mypy/typeshed/stdlib/importlib/resources/simple.pyi @@ -0,0 +1,49 @@ +import abc +import sys +from _typeshed import Incomplete, OpenBinaryMode, OpenTextMode, Unused +from collections.abc import Iterator +from io import TextIOWrapper +from typing import IO, Any, BinaryIO, NoReturn, overload +from typing_extensions import Literal, Never + +if sys.version_info >= (3, 11): + from .abc import Traversable, TraversableResources + + class SimpleReader(abc.ABC): + @property + @abc.abstractmethod + def package(self) -> str: ... + @abc.abstractmethod + def children(self) -> list[SimpleReader]: ... + @abc.abstractmethod + def resources(self) -> list[str]: ... + @abc.abstractmethod + def open_binary(self, resource: str) -> BinaryIO: ... + @property + def name(self) -> str: ... + + class ResourceHandle(Traversable, metaclass=abc.ABCMeta): + parent: ResourceContainer + def __init__(self, parent: ResourceContainer, name: str) -> None: ... + def is_file(self) -> Literal[True]: ... + def is_dir(self) -> Literal[False]: ... + @overload + def open(self, mode: OpenTextMode = "r", *args: Incomplete, **kwargs: Incomplete) -> TextIOWrapper: ... + @overload + def open(self, mode: OpenBinaryMode, *args: Unused, **kwargs: Unused) -> BinaryIO: ... + @overload + def open(self, mode: str, *args: Incomplete, **kwargs: Incomplete) -> IO[Any]: ... + def joinpath(self, name: Never) -> NoReturn: ... # type: ignore[override] + + class ResourceContainer(Traversable, metaclass=abc.ABCMeta): + reader: SimpleReader + def __init__(self, reader: SimpleReader) -> None: ... + def is_dir(self) -> Literal[True]: ... + def is_file(self) -> Literal[False]: ... + def iterdir(self) -> Iterator[ResourceHandle | ResourceContainer]: ... + def open(self, *args: Never, **kwargs: Never) -> NoReturn: ... # type: ignore[override] + if sys.version_info < (3, 12): + def joinpath(self, *descendants: str) -> Traversable: ... + + class TraversableReader(TraversableResources, SimpleReader, metaclass=abc.ABCMeta): + def files(self) -> ResourceContainer: ... 
diff --git a/mypy/typeshed/stdlib/importlib/simple.pyi b/mypy/typeshed/stdlib/importlib/simple.pyi new file mode 100644 index 000000000000..58d8c6617082 --- /dev/null +++ b/mypy/typeshed/stdlib/importlib/simple.pyi @@ -0,0 +1,11 @@ +import sys + +if sys.version_info >= (3, 11): + from .resources.simple import ( + ResourceContainer as ResourceContainer, + ResourceHandle as ResourceHandle, + SimpleReader as SimpleReader, + TraversableReader as TraversableReader, + ) + + __all__ = ["SimpleReader", "ResourceHandle", "ResourceContainer", "TraversableReader"] diff --git a/mypy/typeshed/stdlib/inspect.pyi b/mypy/typeshed/stdlib/inspect.pyi index 601d23e786ac..6498719df887 100644 --- a/mypy/typeshed/stdlib/inspect.pyi +++ b/mypy/typeshed/stdlib/inspect.pyi @@ -294,6 +294,14 @@ _SourceObjectType: TypeAlias = ( def findsource(object: _SourceObjectType) -> tuple[list[str], int]: ... def getabsfile(object: _SourceObjectType, _filename: str | None = None) -> str: ... + +# Special-case the two most common input types here +# to avoid the annoyingly vague `Sequence[str]` return type +@overload +def getblock(lines: list[str]) -> list[str]: ... +@overload +def getblock(lines: tuple[str, ...]) -> tuple[str, ...]: ... +@overload def getblock(lines: Sequence[str]) -> Sequence[str]: ... def getdoc(object: object) -> str | None: ... def getcomments(object: object) -> str | None: ... diff --git a/mypy/typeshed/stdlib/io.pyi b/mypy/typeshed/stdlib/io.pyi index c114f839594f..16270b948f35 100644 --- a/mypy/typeshed/stdlib/io.pyi +++ b/mypy/typeshed/stdlib/io.pyi @@ -6,7 +6,7 @@ from _typeshed import FileDescriptorOrPath, ReadableBuffer, WriteableBuffer from collections.abc import Callable, Iterable, Iterator from os import _Opener from types import TracebackType -from typing import IO, Any, BinaryIO, TextIO +from typing import IO, Any, BinaryIO, TextIO, TypeVar, overload from typing_extensions import Literal, Self __all__ = [ @@ -33,6 +33,8 @@ __all__ = [ if sys.version_info >= (3, 8): __all__ += ["open_code"] +_T = TypeVar("_T") + DEFAULT_BUFFER_SIZE: Literal[8192] SEEK_SET: Literal[0] @@ -92,7 +94,7 @@ class BufferedIOBase(IOBase): class FileIO(RawIOBase, BinaryIO): # type: ignore[misc] # incompatible definitions of writelines in the base classes mode: str - name: FileDescriptorOrPath # type: ignore[assignment] + name: FileDescriptorOrPath def __init__( self, file: FileDescriptorOrPath, mode: str = ..., closefd: bool = ..., opener: _Opener | None = ... ) -> None: ... @@ -194,3 +196,9 @@ class IncrementalNewlineDecoder(codecs.IncrementalDecoder): @property def newlines(self) -> str | tuple[str, ...] | None: ... def setstate(self, __state: tuple[bytes, int]) -> None: ... + +if sys.version_info >= (3, 10): + @overload + def text_encoding(__encoding: None, __stacklevel: int = 2) -> Literal["locale", "utf-8"]: ... + @overload + def text_encoding(__encoding: _T, __stacklevel: int = 2) -> _T: ... diff --git a/mypy/typeshed/stdlib/ipaddress.pyi b/mypy/typeshed/stdlib/ipaddress.pyi index 945e8bcbbdee..13a8c4330a50 100644 --- a/mypy/typeshed/stdlib/ipaddress.pyi +++ b/mypy/typeshed/stdlib/ipaddress.pyi @@ -1,5 +1,5 @@ import sys -from collections.abc import Container, Iterable, Iterator +from collections.abc import Iterable, Iterator from typing import Any, Generic, SupportsInt, TypeVar, overload from typing_extensions import Literal, Self, TypeAlias @@ -70,7 +70,7 @@ class _BaseAddress(_IPAddressBase, SupportsInt): @property def packed(self) -> bytes: ... 
-class _BaseNetwork(_IPAddressBase, Container[_A], Iterable[_A], Generic[_A]): +class _BaseNetwork(_IPAddressBase, Generic[_A]): network_address: _A netmask: _A def __init__(self, address: object, strict: bool = ...) -> None: ... diff --git a/mypy/typeshed/stdlib/itertools.pyi b/mypy/typeshed/stdlib/itertools.pyi index 4b5d624c78d7..1bc0b2ec7390 100644 --- a/mypy/typeshed/stdlib/itertools.pyi +++ b/mypy/typeshed/stdlib/itertools.pyi @@ -10,6 +10,7 @@ _T = TypeVar("_T") _S = TypeVar("_S") _N = TypeVar("_N", int, float, SupportsFloat, SupportsInt, SupportsIndex, SupportsComplex) _T_co = TypeVar("_T_co", covariant=True) +_S_co = TypeVar("_S_co", covariant=True) _T1 = TypeVar("_T1") _T2 = TypeVar("_T2") _T3 = TypeVar("_T3") @@ -23,7 +24,7 @@ _Predicate: TypeAlias = Callable[[_T], object] # Technically count can take anything that implements a number protocol and has an add method # but we can't enforce the add method -class count(Iterator[_N], Generic[_N]): +class count(Iterator[_N]): @overload def __new__(cls) -> count[int]: ... @overload @@ -33,12 +34,12 @@ class count(Iterator[_N], Generic[_N]): def __next__(self) -> _N: ... def __iter__(self) -> Self: ... -class cycle(Iterator[_T], Generic[_T]): +class cycle(Iterator[_T]): def __init__(self, __iterable: Iterable[_T]) -> None: ... def __next__(self) -> _T: ... def __iter__(self) -> Self: ... -class repeat(Iterator[_T], Generic[_T]): +class repeat(Iterator[_T]): @overload def __init__(self, object: _T) -> None: ... @overload @@ -47,7 +48,7 @@ class repeat(Iterator[_T], Generic[_T]): def __iter__(self) -> Self: ... def __length_hint__(self) -> int: ... -class accumulate(Iterator[_T], Generic[_T]): +class accumulate(Iterator[_T]): if sys.version_info >= (3, 8): @overload def __init__(self, iterable: Iterable[_T], func: None = None, *, initial: _T | None = ...) -> None: ... @@ -59,7 +60,7 @@ class accumulate(Iterator[_T], Generic[_T]): def __iter__(self) -> Self: ... def __next__(self) -> _T: ... -class chain(Iterator[_T], Generic[_T]): +class chain(Iterator[_T]): def __init__(self, *iterables: Iterable[_T]) -> None: ... def __next__(self) -> _T: ... def __iter__(self) -> Self: ... @@ -69,30 +70,30 @@ class chain(Iterator[_T], Generic[_T]): if sys.version_info >= (3, 9): def __class_getitem__(cls, __item: Any) -> GenericAlias: ... -class compress(Iterator[_T], Generic[_T]): +class compress(Iterator[_T]): def __init__(self, data: Iterable[_T], selectors: Iterable[Any]) -> None: ... def __iter__(self) -> Self: ... def __next__(self) -> _T: ... -class dropwhile(Iterator[_T], Generic[_T]): +class dropwhile(Iterator[_T]): def __init__(self, __predicate: _Predicate[_T], __iterable: Iterable[_T]) -> None: ... def __iter__(self) -> Self: ... def __next__(self) -> _T: ... -class filterfalse(Iterator[_T], Generic[_T]): +class filterfalse(Iterator[_T]): def __init__(self, __predicate: _Predicate[_T] | None, __iterable: Iterable[_T]) -> None: ... def __iter__(self) -> Self: ... def __next__(self) -> _T: ... -class groupby(Iterator[tuple[_T, Iterator[_S]]], Generic[_T, _S]): +class groupby(Iterator[tuple[_T_co, Iterator[_S_co]]], Generic[_T_co, _S_co]): @overload def __new__(cls, iterable: Iterable[_T1], key: None = None) -> groupby[_T1, _T1]: ... @overload def __new__(cls, iterable: Iterable[_T1], key: Callable[[_T1], _T2]) -> groupby[_T2, _T1]: ... def __iter__(self) -> Self: ... - def __next__(self) -> tuple[_T, Iterator[_S]]: ... + def __next__(self) -> tuple[_T_co, Iterator[_S_co]]: ... 
-class islice(Iterator[_T], Generic[_T]): +class islice(Iterator[_T]): @overload def __init__(self, __iterable: Iterable[_T], __stop: int | None) -> None: ... @overload @@ -100,19 +101,19 @@ class islice(Iterator[_T], Generic[_T]): def __iter__(self) -> Self: ... def __next__(self) -> _T: ... -class starmap(Iterator[_T], Generic[_T]): - def __init__(self, __function: Callable[..., _T], __iterable: Iterable[Iterable[Any]]) -> None: ... +class starmap(Iterator[_T_co]): + def __new__(cls, __function: Callable[..., _T], __iterable: Iterable[Iterable[Any]]) -> starmap[_T]: ... def __iter__(self) -> Self: ... - def __next__(self) -> _T: ... + def __next__(self) -> _T_co: ... -class takewhile(Iterator[_T], Generic[_T]): +class takewhile(Iterator[_T]): def __init__(self, __predicate: _Predicate[_T], __iterable: Iterable[_T]) -> None: ... def __iter__(self) -> Self: ... def __next__(self) -> _T: ... def tee(__iterable: Iterable[_T], __n: int = 2) -> tuple[Iterator[_T], ...]: ... -class zip_longest(Iterator[_T_co], Generic[_T_co]): +class zip_longest(Iterator[_T_co]): # one iterable (fillvalue doesn't matter) @overload def __new__(cls, __iter1: Iterable[_T1], *, fillvalue: object = ...) -> zip_longest[tuple[_T1]]: ... @@ -192,7 +193,7 @@ class zip_longest(Iterator[_T_co], Generic[_T_co]): def __iter__(self) -> Self: ... def __next__(self) -> _T_co: ... -class product(Iterator[_T_co], Generic[_T_co]): +class product(Iterator[_T_co]): @overload def __new__(cls, __iter1: Iterable[_T1]) -> product[tuple[_T1]]: ... @overload @@ -241,12 +242,21 @@ class product(Iterator[_T_co], Generic[_T_co]): def __iter__(self) -> Self: ... def __next__(self) -> _T_co: ... -class permutations(Iterator[tuple[_T, ...]], Generic[_T]): - def __init__(self, iterable: Iterable[_T], r: int | None = ...) -> None: ... +class permutations(Iterator[_T_co]): + @overload + def __new__(cls, iterable: Iterable[_T], r: Literal[2]) -> permutations[tuple[_T, _T]]: ... + @overload + def __new__(cls, iterable: Iterable[_T], r: Literal[3]) -> permutations[tuple[_T, _T, _T]]: ... + @overload + def __new__(cls, iterable: Iterable[_T], r: Literal[4]) -> permutations[tuple[_T, _T, _T, _T]]: ... + @overload + def __new__(cls, iterable: Iterable[_T], r: Literal[5]) -> permutations[tuple[_T, _T, _T, _T, _T]]: ... + @overload + def __new__(cls, iterable: Iterable[_T], r: int | None = ...) -> permutations[tuple[_T, ...]]: ... def __iter__(self) -> Self: ... - def __next__(self) -> tuple[_T, ...]: ... + def __next__(self) -> _T_co: ... -class combinations(Iterator[_T_co], Generic[_T_co]): +class combinations(Iterator[_T_co]): @overload def __new__(cls, iterable: Iterable[_T], r: Literal[2]) -> combinations[tuple[_T, _T]]: ... @overload @@ -260,13 +270,22 @@ class combinations(Iterator[_T_co], Generic[_T_co]): def __iter__(self) -> Self: ... def __next__(self) -> _T_co: ... -class combinations_with_replacement(Iterator[tuple[_T, ...]], Generic[_T]): - def __init__(self, iterable: Iterable[_T], r: int) -> None: ... +class combinations_with_replacement(Iterator[_T_co]): + @overload + def __new__(cls, iterable: Iterable[_T], r: Literal[2]) -> combinations_with_replacement[tuple[_T, _T]]: ... + @overload + def __new__(cls, iterable: Iterable[_T], r: Literal[3]) -> combinations_with_replacement[tuple[_T, _T, _T]]: ... + @overload + def __new__(cls, iterable: Iterable[_T], r: Literal[4]) -> combinations_with_replacement[tuple[_T, _T, _T, _T]]: ... 
+ @overload + def __new__(cls, iterable: Iterable[_T], r: Literal[5]) -> combinations_with_replacement[tuple[_T, _T, _T, _T, _T]]: ... + @overload + def __new__(cls, iterable: Iterable[_T], r: int) -> combinations_with_replacement[tuple[_T, ...]]: ... def __iter__(self) -> Self: ... - def __next__(self) -> tuple[_T, ...]: ... + def __next__(self) -> _T_co: ... if sys.version_info >= (3, 10): - class pairwise(Iterator[_T_co], Generic[_T_co]): + class pairwise(Iterator[_T_co]): def __new__(cls, __iterable: Iterable[_T]) -> pairwise[tuple[_T, _T]]: ... def __iter__(self) -> Self: ... def __next__(self) -> _T_co: ... diff --git a/mypy/typeshed/stdlib/locale.pyi b/mypy/typeshed/stdlib/locale.pyi index 2e95c659dbcd..c18523e04361 100644 --- a/mypy/typeshed/stdlib/locale.pyi +++ b/mypy/typeshed/stdlib/locale.pyi @@ -8,7 +8,6 @@ from _locale import ( LC_NUMERIC as LC_NUMERIC, LC_TIME as LC_TIME, localeconv as localeconv, - setlocale as setlocale, strcoll as strcoll, strxfrm as strxfrm, ) @@ -16,7 +15,7 @@ from _locale import ( # This module defines a function "str()", which is why "str" can't be used # as a type annotation or type alias. from builtins import str as _str -from collections.abc import Callable +from collections.abc import Callable, Iterable from decimal import Decimal from typing import Any @@ -131,6 +130,7 @@ def getdefaultlocale( envvars: tuple[_str, ...] = ("LC_ALL", "LC_CTYPE", "LANG", "LANGUAGE") ) -> tuple[_str | None, _str | None]: ... def getlocale(category: int = ...) -> tuple[_str | None, _str | None]: ... +def setlocale(category: int, locale: _str | Iterable[_str | None] | None = None) -> _str: ... def getpreferredencoding(do_setlocale: bool = True) -> _str: ... def normalize(localename: _str) -> _str: ... def resetlocale(category: int = ...) -> None: ... diff --git a/mypy/typeshed/stdlib/logging/__init__.pyi b/mypy/typeshed/stdlib/logging/__init__.pyi index db797d4180ea..5a72b1fcd799 100644 --- a/mypy/typeshed/stdlib/logging/__init__.pyi +++ b/mypy/typeshed/stdlib/logging/__init__.pyi @@ -7,7 +7,7 @@ from re import Pattern from string import Template from time import struct_time from types import FrameType, TracebackType -from typing import Any, ClassVar, Generic, TextIO, TypeVar, overload +from typing import Any, ClassVar, Generic, Protocol, TextIO, TypeVar, overload from typing_extensions import Literal, Self, TypeAlias if sys.version_info >= (3, 11): @@ -66,10 +66,20 @@ if sys.version_info >= (3, 12): _SysExcInfoType: TypeAlias = tuple[type[BaseException], BaseException, TracebackType | None] | tuple[None, None, None] _ExcInfoType: TypeAlias = None | bool | _SysExcInfoType | BaseException _ArgsType: TypeAlias = tuple[object, ...] | Mapping[str, object] -_FilterType: TypeAlias = Filter | Callable[[LogRecord], bool] _Level: TypeAlias = int | str _FormatStyle: TypeAlias = Literal["%", "{", "$"] +if sys.version_info >= (3, 12): + class _SupportsFilter(Protocol): + def filter(self, __record: LogRecord) -> bool | LogRecord: ... + + _FilterType: TypeAlias = Filter | Callable[[LogRecord], bool | LogRecord] | _SupportsFilter +else: + class _SupportsFilter(Protocol): + def filter(self, __record: LogRecord) -> bool: ... 
+ + _FilterType: TypeAlias = Filter | Callable[[LogRecord], bool] | _SupportsFilter + raiseExceptions: bool logThreads: bool logMultiprocessing: bool diff --git a/mypy/typeshed/stdlib/logging/config.pyi b/mypy/typeshed/stdlib/logging/config.pyi index e92658f7f1b3..0a61e5b16870 100644 --- a/mypy/typeshed/stdlib/logging/config.pyi +++ b/mypy/typeshed/stdlib/logging/config.pyi @@ -5,7 +5,7 @@ from configparser import RawConfigParser from re import Pattern from threading import Thread from typing import IO, Any, overload -from typing_extensions import Literal, SupportsIndex, TypeAlias, TypedDict +from typing_extensions import Literal, Required, SupportsIndex, TypeAlias, TypedDict from . import Filter, Filterer, Formatter, Handler, Logger, _FilterType, _FormatStyle, _Level @@ -50,18 +50,16 @@ _FilterConfiguration: TypeAlias = _FilterConfigurationTypedDict | dict[str, Any] # Handler config can have additional keys even when not providing a custom factory so we just use `dict`. _HandlerConfiguration: TypeAlias = dict[str, Any] -class _OptionalDictConfigArgs(TypedDict, total=False): +class _DictConfigArgs(TypedDict, total=False): + version: Required[Literal[1]] formatters: dict[str, _FormatterConfiguration] filters: dict[str, _FilterConfiguration] handlers: dict[str, _HandlerConfiguration] loggers: dict[str, _LoggerConfiguration] - root: _RootLoggerConfiguration | None + root: _RootLoggerConfiguration incremental: bool disable_existing_loggers: bool -class _DictConfigArgs(_OptionalDictConfigArgs, TypedDict): - version: Literal[1] - # Accept dict[str, Any] to avoid false positives if called with a dict # type, since dict types are not compatible with TypedDicts. # diff --git a/mypy/typeshed/stdlib/math.pyi b/mypy/typeshed/stdlib/math.pyi index 4a4d592b860d..73b53a713301 100644 --- a/mypy/typeshed/stdlib/math.pyi +++ b/mypy/typeshed/stdlib/math.pyi @@ -125,7 +125,7 @@ def pow(__x: _SupportsFloatOrIndex, __y: _SupportsFloatOrIndex) -> float: ... if sys.version_info >= (3, 8): @overload - def prod(__iterable: Iterable[SupportsIndex], *, start: SupportsIndex = 1) -> int: ... # type: ignore[misc] + def prod(__iterable: Iterable[SupportsIndex], *, start: SupportsIndex = 1) -> int: ... # type: ignore[overload-overlap] @overload def prod(__iterable: Iterable[_SupportsFloatOrIndex], *, start: _SupportsFloatOrIndex = 1) -> float: ... diff --git a/mypy/typeshed/stdlib/multiprocessing/connection.pyi b/mypy/typeshed/stdlib/multiprocessing/connection.pyi index 28696fe6a3a3..333b8820d84d 100644 --- a/mypy/typeshed/stdlib/multiprocessing/connection.pyi +++ b/mypy/typeshed/stdlib/multiprocessing/connection.pyi @@ -31,6 +31,7 @@ class _ConnectionBase: def __exit__( self, exc_type: type[BaseException] | None, exc_value: BaseException | None, exc_tb: types.TracebackType | None ) -> None: ... + def __del__(self) -> None: ... class Connection(_ConnectionBase): ... diff --git a/mypy/typeshed/stdlib/multiprocessing/managers.pyi b/mypy/typeshed/stdlib/multiprocessing/managers.pyi index 9cfc1ebbdd5e..c0ef0a3609d0 100644 --- a/mypy/typeshed/stdlib/multiprocessing/managers.pyi +++ b/mypy/typeshed/stdlib/multiprocessing/managers.pyi @@ -216,3 +216,4 @@ if sys.version_info >= (3, 8): def get_server(self) -> SharedMemoryServer: ... def SharedMemory(self, size: int) -> _SharedMemory: ... def ShareableList(self, sequence: Iterable[_SLT] | None) -> _ShareableList[_SLT]: ... + def __del__(self) -> None: ... 
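The widened `_FilterType` above admits any object exposing a `filter()` method (and, on 3.12, a filter may return a replacement `LogRecord`). A minimal sketch of a filter object accepted by `Logger.addFilter()`; the logger name and redaction logic are purely illustrative:

```python
import logging

class RedactSecrets:
    # Any object with a filter(record) method satisfies the protocol;
    # returning True keeps the (possibly modified) record.
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = str(record.msg).replace("SECRET", "***")
        return True

log = logging.getLogger("example")  # logger name is illustrative
log.addFilter(RedactSecrets())
log.warning("token=SECRET")  # emitted as "token=***"
```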
diff --git a/mypy/typeshed/stdlib/multiprocessing/pool.pyi b/mypy/typeshed/stdlib/multiprocessing/pool.pyi index dd4b865a3574..5ad4bfe93fe9 100644 --- a/mypy/typeshed/stdlib/multiprocessing/pool.pyi +++ b/mypy/typeshed/stdlib/multiprocessing/pool.pyi @@ -91,7 +91,7 @@ class Pool: func: Callable[[_S], _T], iterable: Iterable[_S], chunksize: int | None = None, - callback: Callable[[_T], object] | None = None, + callback: Callable[[list[_T]], object] | None = None, error_callback: Callable[[BaseException], object] | None = None, ) -> MapResult[_T]: ... def imap(self, func: Callable[[_S], _T], iterable: Iterable[_S], chunksize: int | None = 1) -> IMapIterator[_T]: ... @@ -102,7 +102,7 @@ class Pool: func: Callable[..., _T], iterable: Iterable[Iterable[Any]], chunksize: int | None = None, - callback: Callable[[_T], object] | None = None, + callback: Callable[[list[_T]], object] | None = None, error_callback: Callable[[BaseException], object] | None = None, ) -> AsyncResult[list[_T]]: ... def close(self) -> None: ... @@ -112,6 +112,7 @@ class Pool: def __exit__( self, exc_type: type[BaseException] | None, exc_val: BaseException | None, exc_tb: TracebackType | None ) -> None: ... + def __del__(self) -> None: ... class ThreadPool(Pool): def __init__( diff --git a/mypy/typeshed/stdlib/multiprocessing/shared_memory.pyi b/mypy/typeshed/stdlib/multiprocessing/shared_memory.pyi index ae6e2a0ed19f..adbe8b943de6 100644 --- a/mypy/typeshed/stdlib/multiprocessing/shared_memory.pyi +++ b/mypy/typeshed/stdlib/multiprocessing/shared_memory.pyi @@ -20,6 +20,7 @@ class SharedMemory: def size(self) -> int: ... def close(self) -> None: ... def unlink(self) -> None: ... + def __del__(self) -> None: ... class ShareableList(Generic[_SLT]): shm: SharedMemory diff --git a/mypy/typeshed/stdlib/multiprocessing/sharedctypes.pyi b/mypy/typeshed/stdlib/multiprocessing/sharedctypes.pyi index 686a45d9ae41..636d58842158 100644 --- a/mypy/typeshed/stdlib/multiprocessing/sharedctypes.pyi +++ b/mypy/typeshed/stdlib/multiprocessing/sharedctypes.pyi @@ -73,7 +73,7 @@ def synchronized(obj: ctypes.Array[_CT], lock: _LockLike | None = None, ctx: Any def synchronized(obj: _CT, lock: _LockLike | None = None, ctx: Any | None = None) -> SynchronizedBase[_CT]: ... class _AcquireFunc(Protocol): - def __call__(self, block: bool = ..., timeout: float | None = ...) -> bool: ... + def __call__(self, __block: bool = ..., __timeout: float | None = ...) -> bool: ... 
class SynchronizedBase(Generic[_CT]): acquire: _AcquireFunc diff --git a/mypy/typeshed/stdlib/nt.pyi b/mypy/typeshed/stdlib/nt.pyi new file mode 100644 index 000000000000..4066096f4c71 --- /dev/null +++ b/mypy/typeshed/stdlib/nt.pyi @@ -0,0 +1,111 @@ +import sys + +if sys.platform == "win32": + # Actually defined here and re-exported from os at runtime, + # but this leads to less code duplication + from os import ( + F_OK as F_OK, + O_APPEND as O_APPEND, + O_BINARY as O_BINARY, + O_CREAT as O_CREAT, + O_EXCL as O_EXCL, + O_NOINHERIT as O_NOINHERIT, + O_RANDOM as O_RANDOM, + O_RDONLY as O_RDONLY, + O_RDWR as O_RDWR, + O_SEQUENTIAL as O_SEQUENTIAL, + O_SHORT_LIVED as O_SHORT_LIVED, + O_TEMPORARY as O_TEMPORARY, + O_TEXT as O_TEXT, + O_TRUNC as O_TRUNC, + O_WRONLY as O_WRONLY, + P_DETACH as P_DETACH, + P_NOWAIT as P_NOWAIT, + P_NOWAITO as P_NOWAITO, + P_OVERLAY as P_OVERLAY, + P_WAIT as P_WAIT, + R_OK as R_OK, + TMP_MAX as TMP_MAX, + W_OK as W_OK, + X_OK as X_OK, + DirEntry as DirEntry, + abort as abort, + access as access, + chdir as chdir, + chmod as chmod, + close as close, + closerange as closerange, + cpu_count as cpu_count, + device_encoding as device_encoding, + dup as dup, + dup2 as dup2, + error as error, + execv as execv, + execve as execve, + fspath as fspath, + fstat as fstat, + fsync as fsync, + ftruncate as ftruncate, + get_handle_inheritable as get_handle_inheritable, + get_inheritable as get_inheritable, + get_terminal_size as get_terminal_size, + getcwd as getcwd, + getcwdb as getcwdb, + getlogin as getlogin, + getpid as getpid, + getppid as getppid, + isatty as isatty, + kill as kill, + link as link, + listdir as listdir, + lseek as lseek, + lstat as lstat, + mkdir as mkdir, + open as open, + pipe as pipe, + putenv as putenv, + read as read, + readlink as readlink, + remove as remove, + rename as rename, + replace as replace, + rmdir as rmdir, + scandir as scandir, + set_handle_inheritable as set_handle_inheritable, + set_inheritable as set_inheritable, + spawnv as spawnv, + spawnve as spawnve, + startfile as startfile, + stat as stat, + stat_result as stat_result, + statvfs_result as statvfs_result, + strerror as strerror, + symlink as symlink, + system as system, + terminal_size as terminal_size, + times as times, + times_result as times_result, + truncate as truncate, + umask as umask, + uname_result as uname_result, + unlink as unlink, + urandom as urandom, + utime as utime, + waitpid as waitpid, + write as write, + ) + + if sys.version_info >= (3, 9): + from os import unsetenv as unsetenv, waitstatus_to_exitcode as waitstatus_to_exitcode + if sys.version_info >= (3, 11): + from os import EX_OK as EX_OK + if sys.version_info >= (3, 12): + from os import ( + get_blocking as get_blocking, + listdrives as listdrives, + listmounts as listmounts, + listvolumes as listvolumes, + set_blocking as set_blocking, + ) + + environ: dict[str, str] diff --git a/mypy/typeshed/stdlib/os/__init__.pyi b/mypy/typeshed/stdlib/os/__init__.pyi index 7fd04218fd7c..45eaf2a66e80 100644 --- a/mypy/typeshed/stdlib/os/__init__.pyi +++ b/mypy/typeshed/stdlib/os/__init__.pyi @@ -248,7 +248,7 @@ class _Environ(MutableMapping[AnyStr, AnyStr], Generic[AnyStr]): unsetenv: Callable[[AnyStr, AnyStr], object], ) -> None: ... - def setdefault(self, key: AnyStr, value: AnyStr) -> AnyStr: ... # type: ignore[override] + def setdefault(self, key: AnyStr, value: AnyStr) -> AnyStr: ... def copy(self) -> dict[AnyStr, AnyStr]: ... def __delitem__(self, key: AnyStr) -> None: ... 
def __getitem__(self, key: AnyStr) -> AnyStr: ... @@ -923,10 +923,16 @@ def times() -> times_result: ... def waitpid(__pid: int, __options: int) -> tuple[int, int]: ... if sys.platform == "win32": - if sys.version_info >= (3, 8): - def startfile(path: StrOrBytesPath, operation: str | None = None) -> None: ... + if sys.version_info >= (3, 10): + def startfile( + filepath: StrOrBytesPath, + operation: str = ..., + arguments: str = "", + cwd: StrOrBytesPath | None = None, + show_cmd: int = 1, + ) -> None: ... else: - def startfile(filepath: StrOrBytesPath, operation: str | None = None) -> None: ... + def startfile(filepath: StrOrBytesPath, operation: str = ...) -> None: ... else: def spawnlp(mode: int, file: StrOrBytesPath, arg0: StrOrBytesPath, *args: StrOrBytesPath) -> int: ... @@ -964,9 +970,9 @@ else: def WTERMSIG(status: int) -> int: ... if sys.version_info >= (3, 8): def posix_spawn( - path: StrOrBytesPath, - argv: _ExecVArgs, - env: _ExecEnv, + __path: StrOrBytesPath, + __argv: _ExecVArgs, + __env: _ExecEnv, *, file_actions: Sequence[tuple[Any, ...]] | None = ..., setpgroup: int | None = ..., @@ -977,9 +983,9 @@ else: scheduler: tuple[Any, sched_param] | None = ..., ) -> int: ... def posix_spawnp( - path: StrOrBytesPath, - argv: _ExecVArgs, - env: _ExecEnv, + __path: StrOrBytesPath, + __argv: _ExecVArgs, + __env: _ExecEnv, *, file_actions: Sequence[tuple[Any, ...]] | None = ..., setpgroup: int | None = ..., diff --git a/mypy/typeshed/stdlib/pkgutil.pyi b/mypy/typeshed/stdlib/pkgutil.pyi index 59f1f734cf90..4a0c8d101b7a 100644 --- a/mypy/typeshed/stdlib/pkgutil.pyi +++ b/mypy/typeshed/stdlib/pkgutil.pyi @@ -3,6 +3,7 @@ from _typeshed import SupportsRead from collections.abc import Callable, Iterable, Iterator from importlib.abc import Loader, MetaPathFinder, PathEntryFinder from typing import IO, Any, NamedTuple, TypeVar +from typing_extensions import deprecated __all__ = [ "get_importer", @@ -35,8 +36,10 @@ if sys.version_info < (3, 12): class ImpLoader: def __init__(self, fullname: str, file: IO[str], filename: str, etc: tuple[str, str, int]) -> None: ... +@deprecated("Use importlib.util.find_spec() instead. Will be removed in Python 3.14.") def find_loader(fullname: str) -> Loader | None: ... def get_importer(path_item: str) -> PathEntryFinder | None: ... +@deprecated("Use importlib.util.find_spec() instead. Will be removed in Python 3.14.") def get_loader(module_or_name: str) -> Loader | None: ... def iter_importers(fullname: str = "") -> Iterator[MetaPathFinder | PathEntryFinder]: ... def iter_modules(path: Iterable[str] | None = None, prefix: str = "") -> Iterator[ModuleInfo]: ... diff --git a/mypy/typeshed/stdlib/re.pyi b/mypy/typeshed/stdlib/re.pyi index 29ee8b66815e..ec532ca3cffe 100644 --- a/mypy/typeshed/stdlib/re.pyi +++ b/mypy/typeshed/stdlib/re.pyi @@ -67,7 +67,7 @@ class Match(Generic[AnyStr]): @overload def expand(self: Match[str], template: str) -> str: ... @overload - def expand(self: Match[bytes], template: ReadableBuffer) -> bytes: ... # type: ignore[misc] + def expand(self: Match[bytes], template: ReadableBuffer) -> bytes: ... # type: ignore[overload-overlap] @overload def expand(self, template: AnyStr) -> AnyStr: ... # group() returns "AnyStr" or "AnyStr | None", depending on the pattern. @@ -117,19 +117,19 @@ class Pattern(Generic[AnyStr]): @overload def search(self: Pattern[str], string: str, pos: int = 0, endpos: int = sys.maxsize) -> Match[str] | None: ... 
@overload - def search(self: Pattern[bytes], string: ReadableBuffer, pos: int = 0, endpos: int = sys.maxsize) -> Match[bytes] | None: ... # type: ignore[misc] + def search(self: Pattern[bytes], string: ReadableBuffer, pos: int = 0, endpos: int = sys.maxsize) -> Match[bytes] | None: ... # type: ignore[overload-overlap] @overload def search(self, string: AnyStr, pos: int = 0, endpos: int = sys.maxsize) -> Match[AnyStr] | None: ... @overload def match(self: Pattern[str], string: str, pos: int = 0, endpos: int = sys.maxsize) -> Match[str] | None: ... @overload - def match(self: Pattern[bytes], string: ReadableBuffer, pos: int = 0, endpos: int = sys.maxsize) -> Match[bytes] | None: ... # type: ignore[misc] + def match(self: Pattern[bytes], string: ReadableBuffer, pos: int = 0, endpos: int = sys.maxsize) -> Match[bytes] | None: ... # type: ignore[overload-overlap] @overload def match(self, string: AnyStr, pos: int = 0, endpos: int = sys.maxsize) -> Match[AnyStr] | None: ... @overload def fullmatch(self: Pattern[str], string: str, pos: int = 0, endpos: int = sys.maxsize) -> Match[str] | None: ... @overload - def fullmatch(self: Pattern[bytes], string: ReadableBuffer, pos: int = 0, endpos: int = sys.maxsize) -> Match[bytes] | None: ... # type: ignore[misc] + def fullmatch(self: Pattern[bytes], string: ReadableBuffer, pos: int = 0, endpos: int = sys.maxsize) -> Match[bytes] | None: ... # type: ignore[overload-overlap] @overload def fullmatch(self, string: AnyStr, pos: int = 0, endpos: int = sys.maxsize) -> Match[AnyStr] | None: ... @overload @@ -148,13 +148,13 @@ class Pattern(Generic[AnyStr]): @overload def finditer(self: Pattern[str], string: str, pos: int = 0, endpos: int = sys.maxsize) -> Iterator[Match[str]]: ... @overload - def finditer(self: Pattern[bytes], string: ReadableBuffer, pos: int = 0, endpos: int = sys.maxsize) -> Iterator[Match[bytes]]: ... # type: ignore[misc] + def finditer(self: Pattern[bytes], string: ReadableBuffer, pos: int = 0, endpos: int = sys.maxsize) -> Iterator[Match[bytes]]: ... # type: ignore[overload-overlap] @overload def finditer(self, string: AnyStr, pos: int = 0, endpos: int = sys.maxsize) -> Iterator[Match[AnyStr]]: ... @overload def sub(self: Pattern[str], repl: str | Callable[[Match[str]], str], string: str, count: int = 0) -> str: ... @overload - def sub( # type: ignore[misc] + def sub( # type: ignore[overload-overlap] self: Pattern[bytes], repl: ReadableBuffer | Callable[[Match[bytes]], ReadableBuffer], string: ReadableBuffer, @@ -165,7 +165,7 @@ class Pattern(Generic[AnyStr]): @overload def subn(self: Pattern[str], repl: str | Callable[[Match[str]], str], string: str, count: int = 0) -> tuple[str, int]: ... @overload - def subn( # type: ignore[misc] + def subn( # type: ignore[overload-overlap] self: Pattern[bytes], repl: ReadableBuffer | Callable[[Match[bytes]], ReadableBuffer], string: ReadableBuffer, diff --git a/mypy/typeshed/stdlib/shelve.pyi b/mypy/typeshed/stdlib/shelve.pyi index b162b3a85766..59abeafe6fca 100644 --- a/mypy/typeshed/stdlib/shelve.pyi +++ b/mypy/typeshed/stdlib/shelve.pyi @@ -16,7 +16,7 @@ class Shelf(MutableMapping[str, _VT]): def __iter__(self) -> Iterator[str]: ... def __len__(self) -> int: ... @overload # type: ignore[override] - def get(self, key: str, default: None = None) -> _VT | None: ... # type: ignore[misc] # overlapping overloads + def get(self, key: str, default: None = None) -> _VT | None: ... @overload def get(self, key: str, default: _VT) -> _VT: ... 
@overload @@ -29,6 +29,7 @@ class Shelf(MutableMapping[str, _VT]): def __exit__( self, type: type[BaseException] | None, value: BaseException | None, traceback: TracebackType | None ) -> None: ... + def __del__(self) -> None: ... def close(self) -> None: ... def sync(self) -> None: ... diff --git a/mypy/typeshed/stdlib/shutil.pyi b/mypy/typeshed/stdlib/shutil.pyi index 38c50d51b129..78e930920073 100644 --- a/mypy/typeshed/stdlib/shutil.pyi +++ b/mypy/typeshed/stdlib/shutil.pyi @@ -154,13 +154,13 @@ def disk_usage(path: FileDescriptorOrPath) -> _ntuple_diskusage: ... # see https://bugs.python.org/issue33140. We keep it here because it's # in __all__. @overload -def chown(path: StrOrBytesPath, user: str | int, group: None = None) -> None: ... +def chown(path: FileDescriptorOrPath, user: str | int, group: None = None) -> None: ... @overload -def chown(path: StrOrBytesPath, user: None = None, *, group: str | int) -> None: ... +def chown(path: FileDescriptorOrPath, user: None = None, *, group: str | int) -> None: ... @overload -def chown(path: StrOrBytesPath, user: None, group: str | int) -> None: ... +def chown(path: FileDescriptorOrPath, user: None, group: str | int) -> None: ... @overload -def chown(path: StrOrBytesPath, user: str | int, group: str | int) -> None: ... +def chown(path: FileDescriptorOrPath, user: str | int, group: str | int) -> None: ... if sys.version_info >= (3, 8): @overload diff --git a/mypy/typeshed/stdlib/smtplib.pyi b/mypy/typeshed/stdlib/smtplib.pyi index e584d7f571a7..6db7daebbb41 100644 --- a/mypy/typeshed/stdlib/smtplib.pyi +++ b/mypy/typeshed/stdlib/smtplib.pyi @@ -68,9 +68,9 @@ def quotedata(data: str) -> str: ... class _AuthObject(Protocol): @overload - def __call__(self, challenge: None = None) -> str | None: ... + def __call__(self, __challenge: None = None) -> str | None: ... @overload - def __call__(self, challenge: bytes) -> str: ... + def __call__(self, __challenge: bytes) -> str: ... class SMTP: debuglevel: int diff --git a/mypy/typeshed/stdlib/sqlite3/dbapi2.pyi b/mypy/typeshed/stdlib/sqlite3/dbapi2.pyi index e85f49207763..236e093c9909 100644 --- a/mypy/typeshed/stdlib/sqlite3/dbapi2.pyi +++ b/mypy/typeshed/stdlib/sqlite3/dbapi2.pyi @@ -349,10 +349,10 @@ class Connection: def create_function(self, name: str, num_params: int, func: Callable[..., _SqliteData] | None) -> None: ... @overload - def cursor(self, cursorClass: None = None) -> Cursor: ... + def cursor(self, factory: None = None) -> Cursor: ... @overload - def cursor(self, cursorClass: Callable[[Connection], _CursorT]) -> _CursorT: ... - def execute(self, sql: str, parameters: _Parameters = ...) -> Cursor: ... + def cursor(self, factory: Callable[[Connection], _CursorT]) -> _CursorT: ... + def execute(self, __sql: str, __parameters: _Parameters = ...) -> Cursor: ... def executemany(self, __sql: str, __parameters: Iterable[_Parameters]) -> Cursor: ... def executescript(self, __sql_script: str) -> Cursor: ... def interrupt(self) -> None: ... 
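The `sqlite3` hunk above renames the `Connection.cursor()` keyword to `factory`, matching its runtime name, and makes the first parameters of `execute()` positional-only. A small usage sketch under those signatures:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The keyword now matches the runtime parameter name "factory":
cur = conn.cursor(factory=sqlite3.Cursor)

# sql and parameters are positional-only at runtime, so keyword calls such as
# conn.execute(sql="...") are rejected by mypy as well.
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t VALUES (?)", (1,))
print(conn.execute("SELECT x FROM t").fetchall())
```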
diff --git a/mypy/typeshed/stdlib/subprocess.pyi b/mypy/typeshed/stdlib/subprocess.pyi index 1013db7ee984..b89623f05c99 100644 --- a/mypy/typeshed/stdlib/subprocess.pyi +++ b/mypy/typeshed/stdlib/subprocess.pyi @@ -248,7 +248,7 @@ if sys.version_info >= (3, 11): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any = None, creationflags: int = 0, restore_signals: bool = True, @@ -260,7 +260,7 @@ if sys.version_info >= (3, 11): encoding: None = None, errors: None = None, input: ReadableBuffer | None = None, - text: Literal[None, False] = None, + text: Literal[False] | None = None, timeout: float | None = None, user: str | int | None = None, group: str | int | None = None, @@ -452,7 +452,7 @@ elif sys.version_info >= (3, 10): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any = None, creationflags: int = 0, restore_signals: bool = True, @@ -464,7 +464,7 @@ elif sys.version_info >= (3, 10): encoding: None = None, errors: None = None, input: ReadableBuffer | None = None, - text: Literal[None, False] = None, + text: Literal[False] | None = None, timeout: float | None = None, user: str | int | None = None, group: str | int | None = None, @@ -650,7 +650,7 @@ elif sys.version_info >= (3, 9): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any = None, creationflags: int = 0, restore_signals: bool = True, @@ -662,7 +662,7 @@ elif sys.version_info >= (3, 9): encoding: None = None, errors: None = None, input: ReadableBuffer | None = None, - text: Literal[None, False] = None, + text: Literal[False] | None = None, timeout: float | None = None, user: str | int | None = None, group: str | int | None = None, @@ -829,7 +829,7 @@ else: shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any = None, creationflags: int = 0, restore_signals: bool = True, @@ -841,7 +841,7 @@ else: encoding: None = None, errors: None = None, input: ReadableBuffer | None = None, - text: Literal[None, False] = None, + text: Literal[False] | None = None, timeout: float | None = None, ) -> CompletedProcess[bytes]: ... 
@overload @@ -1242,7 +1242,7 @@ if sys.version_info >= (3, 11): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any = None, creationflags: int = 0, restore_signals: bool = True, @@ -1253,7 +1253,7 @@ if sys.version_info >= (3, 11): input: _InputString | None = ..., encoding: None = None, errors: None = None, - text: Literal[None, False] = None, + text: Literal[False] | None = None, user: str | int | None = None, group: str | int | None = None, extra_groups: Iterable[str | int] | None = None, @@ -1428,7 +1428,7 @@ elif sys.version_info >= (3, 10): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any = None, creationflags: int = 0, restore_signals: bool = True, @@ -1439,7 +1439,7 @@ elif sys.version_info >= (3, 10): input: _InputString | None = ..., encoding: None = None, errors: None = None, - text: Literal[None, False] = None, + text: Literal[False] | None = None, user: str | int | None = None, group: str | int | None = None, extra_groups: Iterable[str | int] | None = None, @@ -1608,7 +1608,7 @@ elif sys.version_info >= (3, 9): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any = None, creationflags: int = 0, restore_signals: bool = True, @@ -1619,7 +1619,7 @@ elif sys.version_info >= (3, 9): input: _InputString | None = ..., encoding: None = None, errors: None = None, - text: Literal[None, False] = None, + text: Literal[False] | None = None, user: str | int | None = None, group: str | int | None = None, extra_groups: Iterable[str | int] | None = None, @@ -1769,7 +1769,7 @@ else: shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any = None, creationflags: int = 0, restore_signals: bool = True, @@ -1780,7 +1780,7 @@ else: input: _InputString | None = ..., encoding: None = None, errors: None = None, - text: Literal[None, False] = None, + text: Literal[False] | None = None, ) -> bytes: ... 
@overload def check_output( @@ -1990,14 +1990,14 @@ class Popen(Generic[AnyStr]): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any | None = None, creationflags: int = 0, restore_signals: bool = True, start_new_session: bool = False, pass_fds: Collection[int] = (), *, - text: Literal[None, False] = None, + text: Literal[False] | None = None, encoding: None = None, errors: None = None, user: str | int | None = None, @@ -2175,14 +2175,14 @@ class Popen(Generic[AnyStr]): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any | None = None, creationflags: int = 0, restore_signals: bool = True, start_new_session: bool = False, pass_fds: Collection[int] = (), *, - text: Literal[None, False] = None, + text: Literal[False] | None = None, encoding: None = None, errors: None = None, user: str | int | None = None, @@ -2354,14 +2354,14 @@ class Popen(Generic[AnyStr]): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any | None = None, creationflags: int = 0, restore_signals: bool = True, start_new_session: bool = False, pass_fds: Collection[int] = (), *, - text: Literal[None, False] = None, + text: Literal[False] | None = None, encoding: None = None, errors: None = None, user: str | int | None = None, @@ -2514,14 +2514,14 @@ class Popen(Generic[AnyStr]): shell: bool = False, cwd: StrOrBytesPath | None = None, env: _ENV | None = None, - universal_newlines: Literal[False, None] = None, + universal_newlines: Literal[False] | None = None, startupinfo: Any | None = None, creationflags: int = 0, restore_signals: bool = True, start_new_session: bool = False, pass_fds: Collection[int] = (), *, - text: Literal[None, False] = None, + text: Literal[False] | None = None, encoding: None = None, errors: None = None, ) -> None: ... @@ -2564,6 +2564,7 @@ class Popen(Generic[AnyStr]): def __exit__( self, exc_type: type[BaseException] | None, value: BaseException | None, traceback: TracebackType | None ) -> None: ... + def __del__(self) -> None: ... if sys.version_info >= (3, 9): def __class_getitem__(cls, item: Any) -> GenericAlias: ... diff --git a/mypy/typeshed/stdlib/sunau.pyi b/mypy/typeshed/stdlib/sunau.pyi index 6109b368c01a..b508a1ea8e20 100644 --- a/mypy/typeshed/stdlib/sunau.pyi +++ b/mypy/typeshed/stdlib/sunau.pyi @@ -34,6 +34,7 @@ class Au_read: def __init__(self, f: _File) -> None: ... def __enter__(self) -> Self: ... def __exit__(self, *args: Unused) -> None: ... + def __del__(self) -> None: ... def getfp(self) -> IO[bytes] | None: ... def rewind(self) -> None: ... def close(self) -> None: ... @@ -54,6 +55,7 @@ class Au_write: def __init__(self, f: _File) -> None: ... def __enter__(self) -> Self: ... def __exit__(self, *args: Unused) -> None: ... + def __del__(self) -> None: ... def setnchannels(self, nchannels: int) -> None: ... def getnchannels(self) -> int: ... def setsampwidth(self, sampwidth: int) -> None: ... 
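The `subprocess` edits above spell the bytes-mode restriction as `Literal[False] | None` instead of `Literal[False, None]`; the overload behaviour itself is unchanged. A sketch of how those overloads resolve, assuming an `echo` executable is on PATH:

```python
import subprocess

# text/universal_newlines left at their False/None defaults -> bytes-mode overload
r_bytes = subprocess.run(["echo", "hi"], capture_output=True)
assert isinstance(r_bytes.stdout, bytes)

# text=True selects the str-mode overload instead
r_text = subprocess.run(["echo", "hi"], capture_output=True, text=True)
assert isinstance(r_text.stdout, str)
```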
diff --git a/mypy/typeshed/stdlib/sys.pyi b/mypy/typeshed/stdlib/sys/__init__.pyi similarity index 98% rename from mypy/typeshed/stdlib/sys.pyi rename to mypy/typeshed/stdlib/sys/__init__.pyi index a5e819d53326..1d4111af3a49 100644 --- a/mypy/typeshed/stdlib/sys.pyi +++ b/mypy/typeshed/stdlib/sys/__init__.pyi @@ -17,7 +17,9 @@ _OptExcInfo: TypeAlias = OptExcInfo # noqa: Y047 # TODO: obsolete, remove fall # Intentionally omits one deprecated and one optional method of `importlib.abc.MetaPathFinder` class _MetaPathFinder(Protocol): - def find_spec(self, fullname: str, path: Sequence[str] | None, target: ModuleType | None = ...) -> ModuleSpec | None: ... + def find_spec( + self, __fullname: str, __path: Sequence[str] | None, __target: ModuleType | None = ... + ) -> ModuleSpec | None: ... # ----- sys variables ----- if sys.platform != "win32": @@ -370,3 +372,7 @@ if sys.version_info >= (3, 12): def activate_stack_trampoline(__backend: str) -> None: ... else: def activate_stack_trampoline(__backend: str) -> NoReturn: ... + + from . import _monitoring + + monitoring = _monitoring diff --git a/mypy/typeshed/stdlib/sys/_monitoring.pyi b/mypy/typeshed/stdlib/sys/_monitoring.pyi new file mode 100644 index 000000000000..40aeb9cb5bdb --- /dev/null +++ b/mypy/typeshed/stdlib/sys/_monitoring.pyi @@ -0,0 +1,52 @@ +# This py312+ module provides annotations for `sys.monitoring`. +# It's named `sys._monitoring` in typeshed, +# because trying to import `sys.monitoring` will fail at runtime! +# At runtime, `sys.monitoring` has the unique status +# of being a `types.ModuleType` instance that cannot be directly imported, +# and exists in the `sys`-module namespace despite `sys` not being a package. + +from collections.abc import Callable +from types import CodeType +from typing import Any + +DEBUGGER_ID: int +COVERAGE_ID: int +PROFILER_ID: int +OPTIMIZER_ID: int + +def use_tool_id(__tool_id: int, __name: str) -> None: ... +def free_tool_id(__tool_id: int) -> None: ... +def get_tool(__tool_id: int) -> str | None: ... + +events: _events + +class _events: + BRANCH: int + CALL: int + C_RAISE: int + C_RETURN: int + EXCEPTION_HANDLED: int + INSTRUCTION: int + JUMP: int + LINE: int + NO_EVENTS: int + PY_RESUME: int + PY_RETURN: int + PY_START: int + PY_THROW: int + PY_UNWIND: int + PY_YIELD: int + RAISE: int + RERAISE: int + STOP_ITERATION: int + +def get_events(__tool_id: int) -> int: ... +def set_events(__tool_id: int, __event_set: int) -> None: ... +def get_local_events(__tool_id: int, __code: CodeType) -> int: ... +def set_local_events(__tool_id: int, __code: CodeType, __event_set: int) -> int: ... +def restart_events() -> None: ... + +DISABLE: object +MISSING: object + +def register_callback(__tool_id: int, __event: int, __func: Callable[..., Any] | None) -> Callable[..., Any] | None: ... diff --git a/mypy/typeshed/stdlib/telnetlib.pyi b/mypy/typeshed/stdlib/telnetlib.pyi index 10f6e4930f75..d244d54f2fbf 100644 --- a/mypy/typeshed/stdlib/telnetlib.pyi +++ b/mypy/typeshed/stdlib/telnetlib.pyi @@ -119,3 +119,4 @@ class Telnet: def __exit__( self, type: type[BaseException] | None, value: BaseException | None, traceback: TracebackType | None ) -> None: ... + def __del__(self) -> None: ... diff --git a/mypy/typeshed/stdlib/tempfile.pyi b/mypy/typeshed/stdlib/tempfile.pyi index 61bcde24255b..628f99410732 100644 --- a/mypy/typeshed/stdlib/tempfile.pyi +++ b/mypy/typeshed/stdlib/tempfile.pyi @@ -321,7 +321,7 @@ else: dir: GenericPath[AnyStr] | None = None, ) -> IO[Any]: ... 
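The new `sys/_monitoring.pyi` stub above covers the `sys.monitoring` namespace introduced in Python 3.12. A hedged sketch of how that API is typically driven; the tool name and callback below are illustrative:

```python
import sys

if sys.version_info >= (3, 12):
    mon = sys.monitoring

    def on_py_start(code, instruction_offset):
        print("entering", code.co_name)

    def work() -> int:
        return 1 + 1

    mon.use_tool_id(mon.PROFILER_ID, "demo-tool")
    mon.register_callback(mon.PROFILER_ID, mon.events.PY_START, on_py_start)
    mon.set_events(mon.PROFILER_ID, mon.events.PY_START)
    work()
    mon.set_events(mon.PROFILER_ID, mon.events.NO_EVENTS)
    mon.free_tool_id(mon.PROFILER_ID)
```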
-class _TemporaryFileWrapper(IO[AnyStr], Generic[AnyStr]): +class _TemporaryFileWrapper(IO[AnyStr]): file: IO[AnyStr] # io.TextIOWrapper, io.BufferedReader or io.BufferedWriter name: str delete: bool @@ -629,7 +629,7 @@ class TemporaryDirectory(Generic[AnyStr]): # The overloads overlap, but they should still work fine. @overload -def mkstemp( # type: ignore[misc] +def mkstemp( # type: ignore[overload-overlap] suffix: str | None = None, prefix: str | None = None, dir: StrPath | None = None, text: bool = False ) -> tuple[int, str]: ... @overload @@ -639,7 +639,7 @@ def mkstemp( # The overloads overlap, but they should still work fine. @overload -def mkdtemp(suffix: str | None = None, prefix: str | None = None, dir: StrPath | None = None) -> str: ... # type: ignore[misc] +def mkdtemp(suffix: str | None = None, prefix: str | None = None, dir: StrPath | None = None) -> str: ... # type: ignore[overload-overlap] @overload def mkdtemp(suffix: bytes | None = None, prefix: bytes | None = None, dir: BytesPath | None = None) -> bytes: ... def mktemp(suffix: str = "", prefix: str = "tmp", dir: StrPath | None = None) -> str: ... diff --git a/mypy/typeshed/stdlib/tkinter/__init__.pyi b/mypy/typeshed/stdlib/tkinter/__init__.pyi index a0a88a8ac82e..a73b1e275f11 100644 --- a/mypy/typeshed/stdlib/tkinter/__init__.pyi +++ b/mypy/typeshed/stdlib/tkinter/__init__.pyi @@ -7,7 +7,7 @@ from tkinter.constants import * from tkinter.font import _FontDescription from types import TracebackType from typing import Any, Generic, NamedTuple, TypeVar, overload, type_check_only -from typing_extensions import Literal, TypeAlias, TypedDict +from typing_extensions import Literal, TypeAlias, TypedDict, deprecated if sys.version_info >= (3, 9): __all__ = [ @@ -273,11 +273,16 @@ class Variable: def trace_add(self, mode: _TraceMode, callback: Callable[[str, str, str], object]) -> str: ... def trace_remove(self, mode: _TraceMode, cbname: str) -> None: ... def trace_info(self) -> list[tuple[tuple[_TraceMode, ...], str]]: ... - def trace_variable(self, mode, callback): ... # deprecated - def trace_vdelete(self, mode, cbname) -> None: ... # deprecated - def trace_vinfo(self): ... # deprecated - trace = trace_variable # deprecated + @deprecated("use trace_add() instead of trace()") + def trace(self, mode, callback): ... + @deprecated("use trace_add() instead of trace_variable()") + def trace_variable(self, mode, callback): ... + @deprecated("use trace_remove() instead of trace_vdelete()") + def trace_vdelete(self, mode, cbname) -> None: ... + @deprecated("use trace_info() instead of trace_vinfo()") + def trace_vinfo(self): ... def __eq__(self, other: object) -> bool: ... + def __del__(self) -> None: ... class StringVar(Variable): def __init__(self, master: Misc | None = None, value: str | None = None, name: str | None = None) -> None: ... @@ -343,9 +348,8 @@ class Misc: def tk_focusFollowsMouse(self) -> None: ... def tk_focusNext(self) -> Misc | None: ... def tk_focusPrev(self) -> Misc | None: ... - @overload - def after(self, ms: int, func: None = None) -> None: ... - @overload + # .after() can be called without the "func" argument, but it is basically never what you want. + # It behaves like time.sleep() and freezes the GUI app. def after(self, ms: int | Literal["idle"], func: Callable[..., object], *args: Any) -> str: ... # after_idle is essentially partialmethod(after, "idle") def after_idle(self, func: Callable[..., object], *args: Any) -> str: ... 
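Several stubs in this sync (the tkinter `Variable.trace_*` methods above, `pkgutil.find_loader`/`get_loader`, and the `wave` mark methods further down) now carry `typing_extensions.deprecated`. A sketch of what that decorator does, using invented function names:

```python
from typing_extensions import deprecated

@deprecated("use new_api() instead")
def old_api() -> None:
    ...

def new_api() -> None:
    ...

# Type checkers that implement PEP 702 can flag this call site; at runtime
# typing_extensions also emits a DeprecationWarning when old_api() is called.
old_api()
```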
@@ -2888,7 +2892,7 @@ class Scrollbar(Widget): def fraction(self, x: int, y: int) -> float: ... def identify(self, x: int, y: int) -> Literal["arrow1", "arrow2", "slider", "trough1", "trough2", ""]: ... def get(self) -> tuple[float, float, float, float] | tuple[float, float]: ... - def set(self, first: float, last: float) -> None: ... + def set(self, first: float | str, last: float | str) -> None: ... _TextIndex: TypeAlias = _tkinter.Tcl_Obj | str | float | Misc @@ -3064,11 +3068,40 @@ class Text(Widget, XView, YView): def edit_separator(self) -> None: ... # actually returns empty string def edit_undo(self) -> None: ... # actually returns empty string def get(self, index1: _TextIndex, index2: _TextIndex | None = None) -> str: ... - # TODO: image_* methods - def image_cget(self, index, option): ... - def image_configure(self, index, cnf: Incomplete | None = None, **kw): ... - def image_create(self, index, cnf={}, **kw): ... - def image_names(self): ... + @overload + def image_cget(self, index: _TextIndex, option: Literal["image", "name"]) -> str: ... + @overload + def image_cget(self, index: _TextIndex, option: Literal["padx", "pady"]) -> int: ... + @overload + def image_cget(self, index: _TextIndex, option: Literal["align"]) -> Literal["baseline", "bottom", "center", "top"]: ... + @overload + def image_cget(self, index: _TextIndex, option: str) -> Any: ... + @overload + def image_configure(self, index: _TextIndex, cnf: str) -> tuple[str, str, str, str, str | int]: ... + @overload + def image_configure( + self, + index: _TextIndex, + cnf: dict[str, Any] | None = {}, + *, + align: Literal["baseline", "bottom", "center", "top"] = ..., + image: _ImageSpec = ..., + name: str = ..., + padx: _ScreenUnits = ..., + pady: _ScreenUnits = ..., + ) -> dict[str, tuple[str, str, str, str, str | int]] | None: ... + def image_create( + self, + index: _TextIndex, + cnf: dict[str, Any] | None = {}, + *, + align: Literal["baseline", "bottom", "center", "top"] = ..., + image: _ImageSpec = ..., + name: str = ..., + padx: _ScreenUnits = ..., + pady: _ScreenUnits = ..., + ) -> str: ... + def image_names(self) -> tuple[str, ...]: ... def index(self, index: _TextIndex) -> str: ... def insert(self, index: _TextIndex, chars: str, *args: str | list[str] | tuple[str, ...]) -> None: ... @overload @@ -3166,12 +3199,45 @@ class Text(Widget, XView, YView): def tag_ranges(self, tagName: str) -> tuple[_tkinter.Tcl_Obj, ...]: ... # tag_remove and tag_delete are different def tag_remove(self, tagName: str, index1: _TextIndex, index2: _TextIndex | None = None) -> None: ... - # TODO: window_* methods - def window_cget(self, index, option): ... - def window_configure(self, index, cnf: Incomplete | None = None, **kw): ... + @overload + def window_cget(self, index: _TextIndex, option: Literal["padx", "pady"]) -> int: ... + @overload + def window_cget(self, index: _TextIndex, option: Literal["stretch"]) -> bool: ... # actually returns Literal[0, 1] + @overload + def window_cget(self, index: _TextIndex, option: Literal["align"]) -> Literal["baseline", "bottom", "center", "top"]: ... + @overload # window is set to a widget, but read as the string name. + def window_cget(self, index: _TextIndex, option: Literal["create", "window"]) -> str: ... + @overload + def window_cget(self, index: _TextIndex, option: str) -> Any: ... + @overload + def window_configure(self, index: _TextIndex, cnf: str) -> tuple[str, str, str, str, str | int]: ... 
+ @overload + def window_configure( + self, + index: _TextIndex, + cnf: dict[str, Any] | None = None, + *, + align: Literal["baseline", "bottom", "center", "top"] = ..., + create: str = ..., + padx: _ScreenUnits = ..., + pady: _ScreenUnits = ..., + stretch: bool | Literal[0, 1] = ..., + window: Misc | str = ..., + ) -> dict[str, tuple[str, str, str, str, str | int]] | None: ... window_config = window_configure - def window_create(self, index, cnf={}, **kw) -> None: ... - def window_names(self): ... + def window_create( + self, + index: _TextIndex, + cnf: dict[str, Any] | None = {}, + *, + align: Literal["baseline", "bottom", "center", "top"] = ..., + create: str = ..., + padx: _ScreenUnits = ..., + pady: _ScreenUnits = ..., + stretch: bool | Literal[0, 1] = ..., + window: Misc | str = ..., + ) -> None: ... + def window_names(self) -> tuple[str, ...]: ... def yview_pickplace(self, *what): ... # deprecated class _setit: diff --git a/mypy/typeshed/stdlib/tkinter/dnd.pyi b/mypy/typeshed/stdlib/tkinter/dnd.pyi index 4a6ab42b3e33..5a83bb56679f 100644 --- a/mypy/typeshed/stdlib/tkinter/dnd.pyi +++ b/mypy/typeshed/stdlib/tkinter/dnd.pyi @@ -6,7 +6,7 @@ if sys.version_info >= (3, 9): __all__ = ["dnd_start", "DndHandler"] class _DndSource(Protocol): - def dnd_end(self, target: Widget | None, event: Event[Misc] | None) -> None: ... + def dnd_end(self, __target: Widget | None, __event: Event[Misc] | None) -> None: ... class DndHandler: root: ClassVar[Tk | None] @@ -15,5 +15,6 @@ class DndHandler: def finish(self, event: Event[Misc] | None, commit: int = 0) -> None: ... def on_motion(self, event: Event[Misc]) -> None: ... def on_release(self, event: Event[Misc]) -> None: ... + def __del__(self) -> None: ... def dnd_start(source: _DndSource, event: Event[Misc]) -> DndHandler | None: ... diff --git a/mypy/typeshed/stdlib/tkinter/font.pyi b/mypy/typeshed/stdlib/tkinter/font.pyi index 0a557e921914..9dffcd1ba0c6 100644 --- a/mypy/typeshed/stdlib/tkinter/font.pyi +++ b/mypy/typeshed/stdlib/tkinter/font.pyi @@ -101,6 +101,7 @@ class Font: def metrics(self, *, displayof: tkinter.Misc | None = ...) -> _MetricsDict: ... def measure(self, text: str, displayof: tkinter.Misc | None = None) -> int: ... def __eq__(self, other: object) -> bool: ... + def __del__(self) -> None: ... def families(root: tkinter.Misc | None = None, displayof: tkinter.Misc | None = None) -> tuple[str, ...]: ... def names(root: tkinter.Misc | None = None) -> tuple[str, ...]: ... diff --git a/mypy/typeshed/stdlib/tkinter/ttk.pyi b/mypy/typeshed/stdlib/tkinter/ttk.pyi index bb416717a378..2bbbafbcb945 100644 --- a/mypy/typeshed/stdlib/tkinter/ttk.pyi +++ b/mypy/typeshed/stdlib/tkinter/ttk.pyi @@ -1039,7 +1039,7 @@ class Treeview(Widget, tkinter.XView, tkinter.YView): @overload def heading(self, column: str | int, option: str) -> Any: ... @overload - def heading(self, column: str | int, option: None = None) -> _TreeviewHeaderDict: ... # type: ignore[misc] + def heading(self, column: str | int, option: None = None) -> _TreeviewHeaderDict: ... # type: ignore[overload-overlap] @overload def heading( self, @@ -1083,7 +1083,7 @@ class Treeview(Widget, tkinter.XView, tkinter.YView): @overload def item(self, item: str | int, option: str) -> Any: ... @overload - def item(self, item: str | int, option: None = None) -> _TreeviewItemDict: ... # type: ignore[misc] + def item(self, item: str | int, option: None = None) -> _TreeviewItemDict: ... 
# type: ignore[overload-overlap] @overload def item( self, diff --git a/mypy/typeshed/stdlib/turtle.pyi b/mypy/typeshed/stdlib/turtle.pyi index 36cd5f1f6e9d..fd0723fd73ed 100644 --- a/mypy/typeshed/stdlib/turtle.pyi +++ b/mypy/typeshed/stdlib/turtle.pyi @@ -336,7 +336,7 @@ class TPen: def isvisible(self) -> bool: ... # Note: signatures 1 and 2 overlap unsafely when no arguments are provided @overload - def pen(self) -> _PenState: ... # type: ignore[misc] + def pen(self) -> _PenState: ... # type: ignore[overload-overlap] @overload def pen( self, @@ -382,7 +382,7 @@ class RawTurtle(TPen, TNavigator): def shape(self, name: str) -> None: ... # Unsafely overlaps when no arguments are provided @overload - def shapesize(self) -> tuple[float, float, float]: ... # type: ignore[misc] + def shapesize(self) -> tuple[float, float, float]: ... # type: ignore[overload-overlap] @overload def shapesize( self, stretch_wid: float | None = None, stretch_len: float | None = None, outline: float | None = None @@ -393,7 +393,7 @@ class RawTurtle(TPen, TNavigator): def shearfactor(self, shear: float) -> None: ... # Unsafely overlaps when no arguments are provided @overload - def shapetransform(self) -> tuple[float, float, float, float]: ... # type: ignore[misc] + def shapetransform(self) -> tuple[float, float, float, float]: ... # type: ignore[overload-overlap] @overload def shapetransform( self, t11: float | None = None, t12: float | None = None, t21: float | None = None, t22: float | None = None @@ -617,7 +617,7 @@ def isvisible() -> bool: ... # Note: signatures 1 and 2 overlap unsafely when no arguments are provided @overload -def pen() -> _PenState: ... # type: ignore[misc] +def pen() -> _PenState: ... # type: ignore[overload-overlap] @overload def pen( pen: _PenState | None = None, @@ -656,7 +656,7 @@ if sys.version_info >= (3, 12): # Unsafely overlaps when no arguments are provided @overload -def shapesize() -> tuple[float, float, float]: ... # type: ignore[misc] +def shapesize() -> tuple[float, float, float]: ... # type: ignore[overload-overlap] @overload def shapesize(stretch_wid: float | None = None, stretch_len: float | None = None, outline: float | None = None) -> None: ... @overload @@ -666,7 +666,7 @@ def shearfactor(shear: float) -> None: ... # Unsafely overlaps when no arguments are provided @overload -def shapetransform() -> tuple[float, float, float, float]: ... # type: ignore[misc] +def shapetransform() -> tuple[float, float, float, float]: ... # type: ignore[overload-overlap] @overload def shapetransform( t11: float | None = None, t12: float | None = None, t21: float | None = None, t22: float | None = None diff --git a/mypy/typeshed/stdlib/types.pyi b/mypy/typeshed/stdlib/types.pyi index 8559063834c9..b26a668d273b 100644 --- a/mypy/typeshed/stdlib/types.pyi +++ b/mypy/typeshed/stdlib/types.pyi @@ -16,7 +16,7 @@ from collections.abc import ( from importlib.machinery import ModuleSpec # pytype crashes if types.MappingProxyType inherits from collections.abc.Mapping instead of typing.Mapping -from typing import Any, ClassVar, Generic, Mapping, Protocol, TypeVar, overload # noqa: Y022 +from typing import Any, ClassVar, Mapping, Protocol, TypeVar, overload # noqa: Y022 from typing_extensions import Literal, ParamSpec, Self, TypeVarTuple, final __all__ = [ @@ -309,7 +309,7 @@ class CodeType: ) -> CodeType: ... 
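Throughout this sync, `# type: ignore[misc]` on overlapping overloads becomes `# type: ignore[overload-overlap]`, the dedicated error code mypy now uses for unsafely overlapping overloads. A hedged example of the kind of overlap it refers to, with an invented function:

```python
from __future__ import annotations

from typing import overload

@overload
def parse(x: int) -> int: ...  # type: ignore[overload-overlap]
@overload
def parse(x: object) -> str: ...
def parse(x: object) -> int | str:
    # An int argument matches both overloads but they promise different
    # return types, which is exactly what the new error code reports.
    return x if isinstance(x, int) else str(x)
```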
@final -class MappingProxyType(Mapping[_KT, _VT_co], Generic[_KT, _VT_co]): +class MappingProxyType(Mapping[_KT, _VT_co]): __hash__: ClassVar[None] # type: ignore[assignment] def __new__(cls, mapping: SupportsKeysAndGetItem[_KT, _VT_co]) -> Self: ... def __getitem__(self, __key: _KT) -> _VT_co: ... @@ -335,7 +335,7 @@ class SimpleNamespace: def __delattr__(self, __name: str) -> None: ... class _LoaderProtocol(Protocol): - def load_module(self, fullname: str) -> ModuleType: ... + def load_module(self, __fullname: str) -> ModuleType: ... class ModuleType: __name__: str @@ -593,9 +593,8 @@ _R = TypeVar("_R") _P = ParamSpec("_P") # it's not really an Awaitable, but can be used in an await expression. Real type: Generator & Awaitable -# The type: ignore is due to overlapping overloads, not the use of ParamSpec @overload -def coroutine(func: Callable[_P, Generator[Any, Any, _R]]) -> Callable[_P, Awaitable[_R]]: ... # type: ignore[misc] +def coroutine(func: Callable[_P, Generator[Any, Any, _R]]) -> Callable[_P, Awaitable[_R]]: ... # type: ignore[overload-overlap] @overload def coroutine(func: _Fn) -> _Fn: ... diff --git a/mypy/typeshed/stdlib/typing.pyi b/mypy/typeshed/stdlib/typing.pyi index 6deb0ffd02b3..555df0ea47c8 100644 --- a/mypy/typeshed/stdlib/typing.pyi +++ b/mypy/typeshed/stdlib/typing.pyi @@ -281,7 +281,12 @@ if sys.version_info >= (3, 10): class NewType: def __init__(self, name: str, tp: Any) -> None: ... - def __call__(self, __x: _T) -> _T: ... + if sys.version_info >= (3, 11): + @staticmethod + def __call__(__x: _T) -> _T: ... + else: + def __call__(self, x: _T) -> _T: ... + def __or__(self, other: Any) -> _SpecialForm: ... def __ror__(self, other: Any) -> _SpecialForm: ... __supertype__: type @@ -513,7 +518,7 @@ class Collection(Iterable[_T_co], Container[_T_co], Protocol[_T_co]): @abstractmethod def __len__(self) -> int: ... -class Sequence(Collection[_T_co], Reversible[_T_co], Generic[_T_co]): +class Sequence(Collection[_T_co], Reversible[_T_co]): @overload @abstractmethod def __getitem__(self, index: int) -> _T_co: ... @@ -527,7 +532,7 @@ class Sequence(Collection[_T_co], Reversible[_T_co], Generic[_T_co]): def __iter__(self) -> Iterator[_T_co]: ... def __reversed__(self) -> Iterator[_T_co]: ... -class MutableSequence(Sequence[_T], Generic[_T]): +class MutableSequence(Sequence[_T]): @abstractmethod def insert(self, index: int, value: _T) -> None: ... @overload @@ -557,7 +562,7 @@ class MutableSequence(Sequence[_T], Generic[_T]): def remove(self, value: _T) -> None: ... def __iadd__(self, values: Iterable[_T]) -> typing_extensions.Self: ... -class AbstractSet(Collection[_T_co], Generic[_T_co]): +class AbstractSet(Collection[_T_co]): @abstractmethod def __contains__(self, x: object) -> bool: ... def _hash(self) -> int: ... @@ -573,7 +578,7 @@ class AbstractSet(Collection[_T_co], Generic[_T_co]): def __eq__(self, other: object) -> bool: ... def isdisjoint(self, other: Iterable[Any]) -> bool: ... -class MutableSet(AbstractSet[_T], Generic[_T]): +class MutableSet(AbstractSet[_T]): @abstractmethod def add(self, value: _T) -> None: ... @abstractmethod @@ -607,7 +612,7 @@ class ItemsView(MappingView, AbstractSet[tuple[_KT_co, _VT_co]], Generic[_KT_co, def __xor__(self, other: Iterable[_T]) -> set[tuple[_KT_co, _VT_co] | _T]: ... def __rxor__(self, other: Iterable[_T]) -> set[tuple[_KT_co, _VT_co] | _T]: ... -class KeysView(MappingView, AbstractSet[_KT_co], Generic[_KT_co]): +class KeysView(MappingView, AbstractSet[_KT_co]): def __init__(self, mapping: Mapping[_KT_co, Any]) -> None: ... 
# undocumented def __and__(self, other: Iterable[Any]) -> set[_KT_co]: ... def __rand__(self, other: Iterable[_T]) -> set[_T]: ... @@ -623,7 +628,7 @@ class KeysView(MappingView, AbstractSet[_KT_co], Generic[_KT_co]): def __xor__(self, other: Iterable[_T]) -> set[_KT_co | _T]: ... def __rxor__(self, other: Iterable[_T]) -> set[_KT_co | _T]: ... -class ValuesView(MappingView, Collection[_VT_co], Generic[_VT_co]): +class ValuesView(MappingView, Collection[_VT_co]): def __init__(self, mapping: Mapping[Any, _VT_co]) -> None: ... # undocumented def __contains__(self, value: object) -> bool: ... def __iter__(self) -> Iterator[_VT_co]: ... @@ -646,7 +651,7 @@ class Mapping(Collection[_KT], Generic[_KT, _VT_co]): def __contains__(self, __key: object) -> bool: ... def __eq__(self, __other: object) -> bool: ... -class MutableMapping(Mapping[_KT, _VT], Generic[_KT, _VT]): +class MutableMapping(Mapping[_KT, _VT]): @abstractmethod def __setitem__(self, __key: _KT, __value: _VT) -> None: ... @abstractmethod @@ -703,7 +708,7 @@ TYPE_CHECKING: bool # In stubs, the arguments of the IO class are marked as positional-only. # This differs from runtime, but better reflects the fact that in reality # classes deriving from IO use different names for the arguments. -class IO(Iterator[AnyStr], Generic[AnyStr]): +class IO(Iterator[AnyStr]): # At runtime these are all abstract properties, # but making them abstract in the stub is hugely disruptive, for not much gain. # See #8726 @@ -970,9 +975,8 @@ if sys.version_info >= (3, 12): @property def __module__(self) -> str | None: ... # type: ignore[override] def __getitem__(self, parameters: Any) -> Any: ... - if sys.version_info >= (3, 10): - def __or__(self, right: Any) -> _SpecialForm: ... - def __ror__(self, left: Any) -> _SpecialForm: ... + def __or__(self, right: Any) -> _SpecialForm: ... + def __ror__(self, left: Any) -> _SpecialForm: ... if sys.version_info >= (3, 13): def is_protocol(__tp: type) -> bool: ... diff --git a/mypy/typeshed/stdlib/typing_extensions.pyi b/mypy/typeshed/stdlib/typing_extensions.pyi index b5e2341cd020..5c5b756f5256 100644 --- a/mypy/typeshed/stdlib/typing_extensions.pyi +++ b/mypy/typeshed/stdlib/typing_extensions.pyi @@ -182,6 +182,7 @@ __all__ = [ "is_protocol", "no_type_check", "no_type_check_decorator", + "ReadOnly", ] _T = typing.TypeVar("_T") @@ -220,6 +221,8 @@ def IntVar(name: str) -> Any: ... # returns a new TypeVar class _TypedDict(Mapping[str, object], metaclass=abc.ABCMeta): __required_keys__: ClassVar[frozenset[str]] __optional_keys__: ClassVar[frozenset[str]] + __readonly_keys__: ClassVar[frozenset[str]] + __mutable_keys__: ClassVar[frozenset[str]] __total__: ClassVar[bool] __orig_bases__: ClassVar[tuple[Any, ...]] def copy(self) -> Self: ... @@ -283,7 +286,6 @@ class SupportsIndex(Protocol, metaclass=abc.ABCMeta): if sys.version_info >= (3, 10): from typing import ( Concatenate as Concatenate, - NewType as NewType, ParamSpecArgs as ParamSpecArgs, ParamSpecKwargs as ParamSpecKwargs, TypeAlias as TypeAlias, @@ -308,18 +310,13 @@ else: TypeGuard: _SpecialForm def is_typeddict(tp: object) -> bool: ... - class NewType: - def __init__(self, name: str, tp: Any) -> None: ... - def __call__(self, __x: _T) -> _T: ... 
- __supertype__: type - -# New things in 3.11 -# NamedTuples are not new, but the ability to create generic NamedTuples is new in 3.11 +# New and changed things in 3.11 if sys.version_info >= (3, 11): from typing import ( LiteralString as LiteralString, NamedTuple as NamedTuple, Never as Never, + NewType as NewType, NotRequired as NotRequired, Required as Required, Self as Self, @@ -376,6 +373,14 @@ else: def _replace(self, **kwargs: Any) -> Self: ... + class NewType: + def __init__(self, name: str, tp: Any) -> None: ... + def __call__(self, __obj: _T) -> _T: ... + __supertype__: type + if sys.version_info >= (3, 10): + def __or__(self, other: Any) -> _SpecialForm: ... + def __ror__(self, other: Any) -> _SpecialForm: ... + # New things in 3.xx # The `default` parameter was added to TypeVar, ParamSpec, and TypeVarTuple (PEP 696) # The `infer_variance` parameter was added to TypeVar in 3.12 (PEP 695) @@ -449,7 +454,12 @@ class TypeVarTuple: def __init__(self, name: str, *, default: Any | None = None) -> None: ... def __iter__(self) -> Any: ... # Unpack[Self] -def deprecated(__msg: str, *, category: type[Warning] | None = ..., stacklevel: int = 1) -> Callable[[_T], _T]: ... +class deprecated: + message: str + category: type[Warning] | None + stacklevel: int + def __init__(self, __message: str, *, category: type[Warning] | None = ..., stacklevel: int = 1) -> None: ... + def __call__(self, __arg: _T) -> _T: ... if sys.version_info >= (3, 12): from collections.abc import Buffer as Buffer @@ -496,3 +506,5 @@ class Doc: def __init__(self, __documentation: str) -> None: ... def __hash__(self) -> int: ... def __eq__(self, other: object) -> bool: ... + +ReadOnly: _SpecialForm diff --git a/mypy/typeshed/stdlib/unittest/async_case.pyi b/mypy/typeshed/stdlib/unittest/async_case.pyi index c1de205fbd55..b71eec2e0644 100644 --- a/mypy/typeshed/stdlib/unittest/async_case.pyi +++ b/mypy/typeshed/stdlib/unittest/async_case.pyi @@ -17,3 +17,5 @@ class IsolatedAsyncioTestCase(TestCase): def addAsyncCleanup(self, __func: Callable[_P, Awaitable[object]], *args: _P.args, **kwargs: _P.kwargs) -> None: ... if sys.version_info >= (3, 11): async def enterAsyncContext(self, cm: AbstractAsyncContextManager[_T]) -> _T: ... + if sys.version_info >= (3, 9): + def __del__(self) -> None: ... diff --git a/mypy/typeshed/stdlib/unittest/case.pyi b/mypy/typeshed/stdlib/unittest/case.pyi index aa04e16d62ec..cc5d683e245a 100644 --- a/mypy/typeshed/stdlib/unittest/case.pyi +++ b/mypy/typeshed/stdlib/unittest/case.pyi @@ -52,7 +52,7 @@ else: ) -> bool | None: ... if sys.version_info >= (3, 8): - def addModuleCleanup(__function: Callable[_P, Any], *args: _P.args, **kwargs: _P.kwargs) -> None: ... + def addModuleCleanup(__function: Callable[_P, object], *args: _P.args, **kwargs: _P.kwargs) -> None: ... def doModuleCleanups() -> None: ... if sys.version_info >= (3, 11): @@ -136,7 +136,7 @@ class TestCase: def assertRaises( self, expected_exception: type[BaseException] | tuple[type[BaseException], ...], - callable: Callable[..., Any], + callable: Callable[..., object], *args: Any, **kwargs: Any, ) -> None: ... @@ -149,7 +149,7 @@ class TestCase: self, expected_exception: type[BaseException] | tuple[type[BaseException], ...], expected_regex: str | Pattern[str], - callable: Callable[..., Any], + callable: Callable[..., object], *args: Any, **kwargs: Any, ) -> None: ... 
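The `typing` and `typing_extensions` hunks above model `NewType` as a class with `__or__`/`__ror__`, matching the runtime support for writing unions such as `UserId | None` on Python 3.10+. A small sketch:

```python
from typing import NewType

UserId = NewType("UserId", int)

# Works at runtime on Python 3.10+ because NewType defines __or__/__ror__:
def lookup(user: UserId | None) -> str:
    return "anonymous" if user is None else f"user {int(user)}"

print(lookup(UserId(42)))
print(lookup(None))
```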
@@ -161,7 +161,7 @@ class TestCase: def assertWarns( self, expected_warning: type[Warning] | tuple[type[Warning], ...], - callable: Callable[_P, Any], + callable: Callable[_P, object], *args: _P.args, **kwargs: _P.kwargs, ) -> None: ... @@ -174,7 +174,7 @@ class TestCase: self, expected_warning: type[Warning] | tuple[type[Warning], ...], expected_regex: str | Pattern[str], - callable: Callable[_P, Any], + callable: Callable[_P, object], *args: _P.args, **kwargs: _P.kwargs, ) -> None: ... @@ -256,9 +256,9 @@ class TestCase: def id(self) -> str: ... def shortDescription(self) -> str | None: ... if sys.version_info >= (3, 8): - def addCleanup(self, __function: Callable[_P, Any], *args: _P.args, **kwargs: _P.kwargs) -> None: ... + def addCleanup(self, __function: Callable[_P, object], *args: _P.args, **kwargs: _P.kwargs) -> None: ... else: - def addCleanup(self, function: Callable[_P, Any], *args: _P.args, **kwargs: _P.kwargs) -> None: ... + def addCleanup(self, function: Callable[_P, object], *args: _P.args, **kwargs: _P.kwargs) -> None: ... if sys.version_info >= (3, 11): def enterContext(self, cm: AbstractContextManager[_T]) -> _T: ... @@ -266,7 +266,7 @@ class TestCase: def doCleanups(self) -> None: ... if sys.version_info >= (3, 8): @classmethod - def addClassCleanup(cls, __function: Callable[_P, Any], *args: _P.args, **kwargs: _P.kwargs) -> None: ... + def addClassCleanup(cls, __function: Callable[_P, object], *args: _P.args, **kwargs: _P.kwargs) -> None: ... @classmethod def doClassCleanups(cls) -> None: ... @@ -299,9 +299,9 @@ class TestCase: class FunctionTestCase(TestCase): def __init__( self, - testFunc: Callable[[], Any], - setUp: Callable[[], Any] | None = None, - tearDown: Callable[[], Any] | None = None, + testFunc: Callable[[], object], + setUp: Callable[[], object] | None = None, + tearDown: Callable[[], object] | None = None, description: str | None = None, ) -> None: ... def runTest(self) -> None: ... diff --git a/mypy/typeshed/stdlib/unittest/main.pyi b/mypy/typeshed/stdlib/unittest/main.pyi index d29e9a2b8da8..3e8cb7b764c2 100644 --- a/mypy/typeshed/stdlib/unittest/main.pyi +++ b/mypy/typeshed/stdlib/unittest/main.pyi @@ -11,7 +11,7 @@ MAIN_EXAMPLES: str MODULE_EXAMPLES: str class _TestRunner(Protocol): - def run(self, test: unittest.suite.TestSuite | unittest.case.TestCase) -> unittest.result.TestResult: ... + def run(self, __test: unittest.suite.TestSuite | unittest.case.TestCase) -> unittest.result.TestResult: ... # not really documented class TestProgram: diff --git a/mypy/typeshed/stdlib/unittest/mock.pyi b/mypy/typeshed/stdlib/unittest/mock.pyi index baf025bdeb5a..8e96b23ce959 100644 --- a/mypy/typeshed/stdlib/unittest/mock.pyi +++ b/mypy/typeshed/stdlib/unittest/mock.pyi @@ -318,7 +318,7 @@ class _patcher: # Ideally we'd be able to add an overload for it so that the return type is _patch[MagicMock], # but that's impossible with the current type system. @overload - def __call__( # type: ignore[misc] + def __call__( # type: ignore[overload-overlap] self, target: str, new: _T, @@ -343,7 +343,7 @@ class _patcher: ) -> _patch_default_new: ... 
@overload @staticmethod - def object( # type: ignore[misc] + def object( target: Any, attribute: str, new: _T, diff --git a/mypy/typeshed/stdlib/unittest/util.pyi b/mypy/typeshed/stdlib/unittest/util.pyi index 845accfebedd..c42d1346e4b7 100644 --- a/mypy/typeshed/stdlib/unittest/util.pyi +++ b/mypy/typeshed/stdlib/unittest/util.pyi @@ -1,4 +1,4 @@ -from collections.abc import Sequence +from collections.abc import MutableSequence, Sequence from typing import Any, TypeVar from typing_extensions import TypeAlias @@ -17,7 +17,7 @@ def _common_shorten_repr(*args: str) -> tuple[str, ...]: ... def safe_repr(obj: object, short: bool = False) -> str: ... def strclass(cls: type) -> str: ... def sorted_list_difference(expected: Sequence[_T], actual: Sequence[_T]) -> tuple[list[_T], list[_T]]: ... -def unorderable_list_difference(expected: Sequence[_T], actual: Sequence[_T]) -> tuple[list[_T], list[_T]]: ... +def unorderable_list_difference(expected: MutableSequence[_T], actual: MutableSequence[_T]) -> tuple[list[_T], list[_T]]: ... def three_way_cmp(x: Any, y: Any) -> int: ... def _count_diff_all_purpose(actual: Sequence[_T], expected: Sequence[_T]) -> list[_Mismatch[_T]]: ... def _count_diff_hashable(actual: Sequence[_T], expected: Sequence[_T]) -> list[_Mismatch[_T]]: ... diff --git a/mypy/typeshed/stdlib/urllib/request.pyi b/mypy/typeshed/stdlib/urllib/request.pyi index a4849dfa2e6e..ca3feaea262a 100644 --- a/mypy/typeshed/stdlib/urllib/request.pyi +++ b/mypy/typeshed/stdlib/urllib/request.pyi @@ -227,7 +227,8 @@ class ProxyDigestAuthHandler(BaseHandler, AbstractDigestAuthHandler): class _HTTPConnectionProtocol(Protocol): def __call__( self, - host: str, + __host: str, + *, port: int | None = ..., timeout: float = ..., source_address: tuple[str, int] | None = ..., @@ -336,6 +337,7 @@ class URLopener: def open_https(self, url: str, data: ReadableBuffer | None = None) -> _UrlopenRet: ... # undocumented def open_local_file(self, url: str) -> addinfourl: ... # undocumented def open_unknown_proxy(self, proxy: str, fullurl: str, data: ReadableBuffer | None = None) -> None: ... # undocumented + def __del__(self) -> None: ... class FancyURLopener(URLopener): def prompt_user_passwd(self, host: str, realm: str) -> tuple[str, str]: ... diff --git a/mypy/typeshed/stdlib/wave.pyi b/mypy/typeshed/stdlib/wave.pyi index 0d004d6b2d8a..6b7af2f79da1 100644 --- a/mypy/typeshed/stdlib/wave.pyi +++ b/mypy/typeshed/stdlib/wave.pyi @@ -1,7 +1,7 @@ import sys from _typeshed import ReadableBuffer, Unused from typing import IO, Any, BinaryIO, NamedTuple, NoReturn, overload -from typing_extensions import Literal, Self, TypeAlias +from typing_extensions import Literal, Self, TypeAlias, deprecated if sys.version_info >= (3, 9): __all__ = ["open", "Error", "Wave_read", "Wave_write"] @@ -26,6 +26,7 @@ class Wave_read: def __init__(self, f: _File) -> None: ... def __enter__(self) -> Self: ... def __exit__(self, *args: Unused) -> None: ... + def __del__(self) -> None: ... def getfp(self) -> BinaryIO | None: ... def rewind(self) -> None: ... def close(self) -> None: ... @@ -37,7 +38,9 @@ class Wave_read: def getcomptype(self) -> str: ... def getcompname(self) -> str: ... def getparams(self) -> _wave_params: ... + @deprecated("Deprecated in Python 3.13; removal scheduled for Python 3.15") def getmarkers(self) -> None: ... + @deprecated("Deprecated in Python 3.13; removal scheduled for Python 3.15") def getmark(self, id: Any) -> NoReturn: ... def setpos(self, pos: int) -> None: ... def readframes(self, nframes: int) -> bytes: ... 
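The `unittest` hunks above move callback parameters from `Callable[..., Any]` to `Callable[..., object]`: any return type is still accepted, but the result can no longer be used as `Any` by accident. A sketch of the convention with invented names:

```python
from typing import Callable

def register_cleanup(cb: Callable[[], object]) -> None:
    result = cb()
    # result.upper()  # would now be an error: "object" has no attribute "upper"
    print(result)

def returns_bool() -> bool:
    return True

register_cleanup(returns_bool)  # fine: every return type is a subtype of object
```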
@@ -46,6 +49,7 @@ class Wave_write: def __init__(self, f: _File) -> None: ... def __enter__(self) -> Self: ... def __exit__(self, *args: Unused) -> None: ... + def __del__(self) -> None: ... def setnchannels(self, nchannels: int) -> None: ... def getnchannels(self) -> int: ... def setsampwidth(self, sampwidth: int) -> None: ... @@ -59,8 +63,11 @@ class Wave_write: def getcompname(self) -> str: ... def setparams(self, params: _wave_params | tuple[int, int, int, int, str, str]) -> None: ... def getparams(self) -> _wave_params: ... + @deprecated("Deprecated in Python 3.13; removal scheduled for Python 3.15") def setmark(self, id: Any, pos: Any, name: Any) -> NoReturn: ... + @deprecated("Deprecated in Python 3.13; removal scheduled for Python 3.15") def getmark(self, id: Any) -> NoReturn: ... + @deprecated("Deprecated in Python 3.13; removal scheduled for Python 3.15") def getmarkers(self) -> None: ... def tell(self) -> int: ... def writeframesraw(self, data: ReadableBuffer) -> None: ... diff --git a/mypy/typeshed/stdlib/weakref.pyi b/mypy/typeshed/stdlib/weakref.pyi index ca5366602ceb..1bb2eacfb46a 100644 --- a/mypy/typeshed/stdlib/weakref.pyi +++ b/mypy/typeshed/stdlib/weakref.pyi @@ -40,7 +40,7 @@ _P = ParamSpec("_P") ProxyTypes: tuple[type[Any], ...] -class WeakMethod(ref[_CallableT], Generic[_CallableT]): +class WeakMethod(ref[_CallableT]): def __new__(cls, meth: _CallableT, callback: Callable[[Self], object] | None = None) -> Self: ... def __call__(self) -> _CallableT | None: ... def __eq__(self, other: object) -> bool: ... @@ -75,7 +75,7 @@ class WeakValueDictionary(MutableMapping[_KT, _VT]): def items(self) -> Iterator[tuple[_KT, _VT]]: ... # type: ignore[override] def itervaluerefs(self) -> Iterator[KeyedRef[_KT, _VT]]: ... def valuerefs(self) -> list[KeyedRef[_KT, _VT]]: ... - def setdefault(self, key: _KT, default: _VT) -> _VT: ... # type: ignore[override] + def setdefault(self, key: _KT, default: _VT) -> _VT: ... @overload def pop(self, key: _KT) -> _VT: ... @overload diff --git a/mypy/typeshed/stdlib/xml/etree/ElementTree.pyi b/mypy/typeshed/stdlib/xml/etree/ElementTree.pyi index d8ff2f5b6090..b08ca88e7e97 100644 --- a/mypy/typeshed/stdlib/xml/etree/ElementTree.pyi +++ b/mypy/typeshed/stdlib/xml/etree/ElementTree.pyi @@ -86,7 +86,7 @@ class Element: attrib: dict[str, str] text: str | None tail: str | None - def __init__(self, tag: str | Callable[..., Element], attrib: dict[str, str] = ..., **extra: str) -> None: ... + def __init__(self, tag: str, attrib: dict[str, str] = ..., **extra: str) -> None: ... def append(self, __subelement: Element) -> None: ... def clear(self) -> None: ... def extend(self, __elements: Iterable[Element]) -> None: ... @@ -132,7 +132,7 @@ def SubElement(parent: Element, tag: str, attrib: dict[str, str] = ..., **extra: def Comment(text: str | None = None) -> Element: ... def ProcessingInstruction(target: str, text: str | None = None) -> Element: ... -PI: Callable[..., Element] +PI = ProcessingInstruction class QName: text: str diff --git a/mypy/typeshed/stdlib/xml/sax/saxutils.pyi b/mypy/typeshed/stdlib/xml/sax/saxutils.pyi index 06e03a1e4d06..528f35963947 100644 --- a/mypy/typeshed/stdlib/xml/sax/saxutils.pyi +++ b/mypy/typeshed/stdlib/xml/sax/saxutils.pyi @@ -11,7 +11,7 @@ def quoteattr(data: str, entities: Mapping[str, str] = {}) -> str: ... 
class XMLGenerator(handler.ContentHandler): def __init__( self, - out: TextIOBase | RawIOBase | StreamWriter | StreamReaderWriter | SupportsWrite[str] | None = None, + out: TextIOBase | RawIOBase | StreamWriter | StreamReaderWriter | SupportsWrite[bytes] | None = None, encoding: str = "iso-8859-1", short_empty_elements: bool = False, ) -> None: ... diff --git a/mypy/typeshed/stdlib/xml/sax/xmlreader.pyi b/mypy/typeshed/stdlib/xml/sax/xmlreader.pyi index 74d2efb010cd..2ccbc95bbef0 100644 --- a/mypy/typeshed/stdlib/xml/sax/xmlreader.pyi +++ b/mypy/typeshed/stdlib/xml/sax/xmlreader.pyi @@ -82,6 +82,6 @@ class AttributesNSImpl(AttributesImpl): def __contains__(self, name: _NSName) -> bool: ... # type: ignore[override] @overload # type: ignore[override] def get(self, name: _NSName, alternative: None = None) -> str | None: ... - @overload # type: ignore[override] + @overload def get(self, name: _NSName, alternative: str) -> str: ... def items(self) -> list[tuple[_NSName, str]]: ... # type: ignore[override] diff --git a/mypy/typeshed/stdlib/zipfile.pyi b/mypy/typeshed/stdlib/zipfile.pyi index b7144f3ab528..5483b84fe6f6 100644 --- a/mypy/typeshed/stdlib/zipfile.pyi +++ b/mypy/typeshed/stdlib/zipfile.pyi @@ -26,7 +26,9 @@ __all__ = [ if sys.version_info >= (3, 8): __all__ += ["Path"] -_DateTuple: TypeAlias = tuple[int, int, int, int, int, int] +# TODO: use TypeAlias when mypy bugs are fixed +# https://github.com/python/mypy/issues/16581 +_DateTuple = tuple[int, int, int, int, int, int] # noqa: Y026 _ReadWriteMode: TypeAlias = Literal["r", "w"] _ReadWriteBinaryMode: TypeAlias = Literal["r", "w", "rb", "wb"] _ZipFileMode: TypeAlias = Literal["r", "w", "x", "a"] @@ -187,6 +189,8 @@ class ZipFile: if sys.version_info >= (3, 11): def mkdir(self, zinfo_or_directory_name: str | ZipInfo, mode: int = 0o777) -> None: ... + def __del__(self) -> None: ... + class PyZipFile(ZipFile): def __init__( self, file: str | IO[bytes], mode: _ZipFileMode = "r", compression: int = 0, allowZip64: bool = True, optimize: int = -1 @@ -231,7 +235,7 @@ if sys.version_info >= (3, 8): def make(cls, source: ZipFile) -> CompleteDirs: ... @overload @classmethod - def make(cls: type[Self], source: StrPath | IO[bytes]) -> Self: ... + def make(cls, source: StrPath | IO[bytes]) -> Self: ... 
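
The saxutils change above narrows XMLGenerator's duck-typed `out` parameter from a str writer to a bytes writer; text streams (`TextIOBase`, `StreamWriter`, `StreamReaderWriter`) remain accepted, but a plain object exposing only `write` is expected to take encoded bytes. A minimal sketch, assuming CPython's usual behavior of wrapping a binary sink and encoding the document itself:

```python
from io import BytesIO
from xml.sax.saxutils import XMLGenerator

# A plain bytes buffer is an acceptable `out` stream: XMLGenerator encodes the
# document with the requested encoding before writing.
buf = BytesIO()
gen = XMLGenerator(buf, encoding="utf-8", short_empty_elements=True)
gen.startDocument()
gen.startElement("greeting", {"lang": "en"})
gen.characters("hello")
gen.endElement("greeting")
gen.endDocument()
print(buf.getvalue().decode("utf-8"))
```
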
class Path: root: CompleteDirs diff --git a/mypy/util.py b/mypy/util.py index d0f2f8c6cc36..7a13de427e8e 100644 --- a/mypy/util.py +++ b/mypy/util.py @@ -234,45 +234,85 @@ def get_mypy_comments(source: str) -> list[tuple[int, str]]: return results -PASS_TEMPLATE: Final = """ - - - - +JUNIT_HEADER_TEMPLATE: Final = """ + """ -FAIL_TEMPLATE: Final = """ - - +JUNIT_TESTCASE_FAIL_TEMPLATE: Final = """ {text} - """ -ERROR_TEMPLATE: Final = """ - - +JUNIT_ERROR_TEMPLATE: Final = """ {text} - """ +JUNIT_TESTCASE_PASS_TEMPLATE: Final = """ + +""" -def write_junit_xml( - dt: float, serious: bool, messages: list[str], path: str, version: str, platform: str -) -> None: - from xml.sax.saxutils import escape +JUNIT_FOOTER: Final = """ +""" - if not messages and not serious: - xml = PASS_TEMPLATE.format(time=dt, ver=version, platform=platform) - elif not serious: - xml = FAIL_TEMPLATE.format( - text=escape("\n".join(messages)), time=dt, ver=version, platform=platform - ) + +def _generate_junit_contents( + dt: float, + serious: bool, + messages_by_file: dict[str | None, list[str]], + version: str, + platform: str, +) -> str: + if serious: + failures = 0 + errors = len(messages_by_file) else: - xml = ERROR_TEMPLATE.format( - text=escape("\n".join(messages)), time=dt, ver=version, platform=platform - ) + failures = len(messages_by_file) + errors = 0 + + xml = JUNIT_HEADER_TEMPLATE.format( + errors=errors, + failures=failures, + time=dt, + # If there are no messages, we still write one "test" indicating success. + tests=len(messages_by_file) or 1, + ) + + if not messages_by_file: + xml += JUNIT_TESTCASE_PASS_TEMPLATE.format(time=dt, ver=version, platform=platform) + else: + for filename, messages in messages_by_file.items(): + if filename is not None: + xml += JUNIT_TESTCASE_FAIL_TEMPLATE.format( + text="\n".join(messages), + filename=filename, + time=dt, + name="mypy-py{ver}-{platform} {filename}".format( + ver=version, platform=platform, filename=filename + ), + ) + else: + xml += JUNIT_TESTCASE_FAIL_TEMPLATE.format( + text="\n".join(messages), + filename="mypy", + time=dt, + name="mypy-py{ver}-{platform}".format(ver=version, platform=platform), + ) + + xml += JUNIT_FOOTER + + return xml + + +def write_junit_xml( + dt: float, + serious: bool, + messages_by_file: dict[str | None, list[str]], + path: str, + version: str, + platform: str, +) -> None: + xml = _generate_junit_contents(dt, serious, messages_by_file, version, platform) # checks for a directory structure in path and creates folders if needed xml_dirs = os.path.dirname(os.path.abspath(path)) diff --git a/mypy/version.py b/mypy/version.py index 7cfc68d6e553..900fee26d80c 100644 --- a/mypy/version.py +++ b/mypy/version.py @@ -8,7 +8,7 @@ # - Release versions have the form "1.2.3". # - Dev versions have the form "1.2.3+dev" (PLUS sign to conform to PEP 440). # - Before 1.0 we had the form "0.NNN". -__version__ = "1.7.0+dev" +__version__ = "1.8.0" base_version = __version__ mypy_dir = os.path.abspath(os.path.dirname(os.path.dirname(__file__))) diff --git a/mypyc/build.py b/mypyc/build.py index 0af8908e14d0..485803acba46 100644 --- a/mypyc/build.py +++ b/mypyc/build.py @@ -105,7 +105,14 @@ def emit_messages(options: Options, messages: list[str], dt: float, serious: boo # ... you know, just in case. 
if options.junit_xml: py_version = f"{options.python_version[0]}_{options.python_version[1]}" - write_junit_xml(dt, serious, messages, options.junit_xml, py_version, options.platform) + write_junit_xml( + dt, + serious, + {None: messages} if messages else {}, + options.junit_xml, + py_version, + options.platform, + ) if messages: print("\n".join(messages)) diff --git a/mypyc/irbuild/prebuildvisitor.py b/mypyc/irbuild/prebuildvisitor.py index 519b3445e925..17f907d42111 100644 --- a/mypyc/irbuild/prebuildvisitor.py +++ b/mypyc/irbuild/prebuildvisitor.py @@ -119,9 +119,10 @@ def visit_decorator(self, dec: Decorator) -> None: self.funcs_to_decorators[dec.func] = decorators_to_store super().visit_decorator(dec) - def visit_func_def(self, fdef: FuncItem) -> None: + def visit_func_def(self, fdef: FuncDef) -> None: # TODO: What about overloaded functions? self.visit_func(fdef) + self.visit_symbol_node(fdef) def visit_lambda_expr(self, expr: LambdaExpr) -> None: self.visit_func(expr) diff --git a/mypyc/test-data/run-functions.test b/mypyc/test-data/run-functions.test index bd8f1a9197dd..cf519f30dad8 100644 --- a/mypyc/test-data/run-functions.test +++ b/mypyc/test-data/run-functions.test @@ -1286,3 +1286,25 @@ def bar() -> None: bar() [out] {'__module__': 'native', '__name__': 'bar', '__qualname__': 'bar', '__doc__': None, '__wrapped__': } + +[case testCallNestedFunctionWithNamed] +def f() -> None: + def a() -> None: + pass + def b() -> None: + a() + b() +[file driver.py] +from native import f +f() + +[case testCallNestedFunctionWithLambda] +def f(x: int) -> int: + def inc(x: int) -> int: + return x + 1 + return (lambda x: inc(x))(1) +[file driver.py] +from native import f +print(f(1)) +[out] +2 diff --git a/mypyc/test/test_run.py b/mypyc/test/test_run.py index df9d44eab73f..f5c902bf3b3d 100644 --- a/mypyc/test/test_run.py +++ b/mypyc/test/test_run.py @@ -172,12 +172,10 @@ def run_case_inner(self, testcase: DataDrivenTestCase) -> None: # new by distutils, shift the mtime of all of the # generated artifacts back by a second. fudge_dir_mtimes(WORKDIR, -1) - # On Ubuntu, changing the mtime doesn't work reliably. As + # On some OS, changing the mtime doesn't work reliably. As # a workaround, sleep. - # # TODO: Figure out a better approach, since this slows down tests. 
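
The mypy/util.py refactor above splits the old single-testcase JUnit templates into header/testcase/footer pieces and changes `write_junit_xml()` to accept messages grouped per file (`dict[str | None, list[str]]`, with `None` for messages not tied to any file), as the updated mypyc call site (`{None: messages} if messages else {}`) shows. A hedged sketch of producing that shape from raw output lines; the helper name and the parsing heuristic below are illustrative, not mypy's own code:

```python
from __future__ import annotations

from collections import defaultdict


def group_messages_by_file(messages: list[str]) -> dict[str | None, list[str]]:
    # mypy error lines usually look like "path/to/mod.py:12: error: ...";
    # anything that doesn't match that shape is filed under the None key.
    grouped: dict[str | None, list[str]] = defaultdict(list)
    for line in messages:
        head, sep, _ = line.partition(": ")
        filename = head.rsplit(":", 2)[0] if sep and ":" in head else None
        grouped[filename].append(line)
    return dict(grouped)


print(group_messages_by_file([
    "pkg/mod.py:3: error: Incompatible return value type",
    "Found 1 error in 1 file (checked 1 source file)",
]))
```
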
- if sys.platform == "linux": - time.sleep(1.0) + time.sleep(1.0) step += 1 with chdir_manager(".."): diff --git a/test-data/packages/typedpkg-stubs/pyproject.toml b/test-data/packages/typedpkg-stubs/pyproject.toml index 125816151ef8..c984c5d91e0a 100644 --- a/test-data/packages/typedpkg-stubs/pyproject.toml +++ b/test-data/packages/typedpkg-stubs/pyproject.toml @@ -7,5 +7,5 @@ description = 'test' include = ["**/*.pyi"] [build-system] -requires = ["hatchling"] +requires = ["hatchling==1.18"] build-backend = "hatchling.build" diff --git a/test-data/packages/typedpkg/pyproject.toml b/test-data/packages/typedpkg/pyproject.toml index 5269c94320e1..6b55d4b3df60 100644 --- a/test-data/packages/typedpkg/pyproject.toml +++ b/test-data/packages/typedpkg/pyproject.toml @@ -4,5 +4,5 @@ version = '0.1' description = 'test' [build-system] -requires = ["hatchling"] +requires = ["hatchling==1.18"] build-backend = "hatchling.build" diff --git a/test-data/packages/typedpkg_ns_a/pyproject.toml b/test-data/packages/typedpkg_ns_a/pyproject.toml index cc464af75b17..f41ad16b5bc2 100644 --- a/test-data/packages/typedpkg_ns_a/pyproject.toml +++ b/test-data/packages/typedpkg_ns_a/pyproject.toml @@ -7,5 +7,5 @@ description = 'test' include = ["**/*.py", "**/*.pyi", "**/py.typed"] [build-system] -requires = ["hatchling"] +requires = ["hatchling==1.18"] build-backend = "hatchling.build" diff --git a/test-data/packages/typedpkg_ns_b-stubs/pyproject.toml b/test-data/packages/typedpkg_ns_b-stubs/pyproject.toml index d5275d1ed8b3..2c1c206c361d 100644 --- a/test-data/packages/typedpkg_ns_b-stubs/pyproject.toml +++ b/test-data/packages/typedpkg_ns_b-stubs/pyproject.toml @@ -7,5 +7,5 @@ description = 'test' include = ["**/*.pyi"] [build-system] -requires = ["hatchling"] +requires = ["hatchling==1.18"] build-backend = "hatchling.build" diff --git a/test-data/packages/typedpkg_ns_b/pyproject.toml b/test-data/packages/typedpkg_ns_b/pyproject.toml index 8567af11152e..b8ae0d59072e 100644 --- a/test-data/packages/typedpkg_ns_b/pyproject.toml +++ b/test-data/packages/typedpkg_ns_b/pyproject.toml @@ -4,5 +4,5 @@ version = '0.1' description = 'test' [build-system] -requires = ["hatchling"] +requires = ["hatchling==1.18"] build-backend = "hatchling.build" diff --git a/test-data/pybind11_mypy_demo/src/main.cpp b/test-data/pybind11_mypy_demo/src/main.cpp index 00e5b2f4e871..192a90cf8e30 100644 --- a/test-data/pybind11_mypy_demo/src/main.cpp +++ b/test-data/pybind11_mypy_demo/src/main.cpp @@ -44,6 +44,7 @@ #include #include +#include namespace py = pybind11; @@ -102,6 +103,11 @@ struct Point { return distance_to(other.x, other.y); } + std::vector as_vector() + { + return std::vector{x, y}; + } + double x, y; }; @@ -134,14 +140,15 @@ void bind_basics(py::module& basics) { .def(py::init(), py::arg("x"), py::arg("y")) .def("distance_to", py::overload_cast(&Point::distance_to, py::const_), py::arg("x"), py::arg("y")) .def("distance_to", py::overload_cast(&Point::distance_to, py::const_), py::arg("other")) - .def_readwrite("x", &Point::x) + .def("as_list", &Point::as_vector) + .def_readwrite("x", &Point::x, "some docstring") .def_property("y", [](Point& self){ return self.y; }, [](Point& self, double value){ self.y = value; } ) .def_property_readonly("length", &Point::length) .def_property_readonly_static("x_axis", [](py::object cls){return Point::x_axis;}) - .def_property_readonly_static("y_axis", [](py::object cls){return Point::y_axis;}) + .def_property_readonly_static("y_axis", [](py::object cls){return Point::y_axis;}, "another 
docstring") .def_readwrite_static("length_unit", &Point::length_unit) .def_property_static("angle_unit", [](py::object& /*cls*/){ return Point::angle_unit; }, diff --git a/test-data/pybind11_mypy_demo/stubgen-include-docs/pybind11_mypy_demo/__init__.pyi b/test-data/pybind11_mypy_demo/stubgen-include-docs/pybind11_mypy_demo/__init__.pyi index e69de29bb2d1..0cb252f00259 100644 --- a/test-data/pybind11_mypy_demo/stubgen-include-docs/pybind11_mypy_demo/__init__.pyi +++ b/test-data/pybind11_mypy_demo/stubgen-include-docs/pybind11_mypy_demo/__init__.pyi @@ -0,0 +1 @@ +from . import basics as basics diff --git a/test-data/pybind11_mypy_demo/stubgen-include-docs/pybind11_mypy_demo/basics.pyi b/test-data/pybind11_mypy_demo/stubgen-include-docs/pybind11_mypy_demo/basics.pyi index 676d7f6d3f15..b761291e11f3 100644 --- a/test-data/pybind11_mypy_demo/stubgen-include-docs/pybind11_mypy_demo/basics.pyi +++ b/test-data/pybind11_mypy_demo/stubgen-include-docs/pybind11_mypy_demo/basics.pyi @@ -1,7 +1,7 @@ -from typing import ClassVar +from typing import ClassVar, List, overload -from typing import overload PI: float +__version__: str class Point: class AngleUnit: @@ -13,8 +13,6 @@ class Point: """__init__(self: pybind11_mypy_demo.basics.Point.AngleUnit, value: int) -> None""" def __eq__(self, other: object) -> bool: """__eq__(self: object, other: object) -> bool""" - def __getstate__(self) -> int: - """__getstate__(self: object) -> int""" def __hash__(self) -> int: """__hash__(self: object) -> int""" def __index__(self) -> int: @@ -23,8 +21,6 @@ class Point: """__int__(self: pybind11_mypy_demo.basics.Point.AngleUnit) -> int""" def __ne__(self, other: object) -> bool: """__ne__(self: object, other: object) -> bool""" - def __setstate__(self, state: int) -> None: - """__setstate__(self: pybind11_mypy_demo.basics.Point.AngleUnit, state: int) -> None""" @property def name(self) -> str: ... @property @@ -40,8 +36,6 @@ class Point: """__init__(self: pybind11_mypy_demo.basics.Point.LengthUnit, value: int) -> None""" def __eq__(self, other: object) -> bool: """__eq__(self: object, other: object) -> bool""" - def __getstate__(self) -> int: - """__getstate__(self: object) -> int""" def __hash__(self) -> int: """__hash__(self: object) -> int""" def __index__(self) -> int: @@ -50,8 +44,6 @@ class Point: """__int__(self: pybind11_mypy_demo.basics.Point.LengthUnit) -> int""" def __ne__(self, other: object) -> bool: """__ne__(self: object, other: object) -> bool""" - def __setstate__(self, state: int) -> None: - """__setstate__(self: pybind11_mypy_demo.basics.Point.LengthUnit, state: int) -> None""" @property def name(self) -> str: ... @property @@ -70,7 +62,8 @@ class Point: 1. __init__(self: pybind11_mypy_demo.basics.Point) -> None - 2. __init__(self: pybind11_mypy_demo.basics.Point, x: float, y: float) -> None""" + 2. __init__(self: pybind11_mypy_demo.basics.Point, x: float, y: float) -> None + """ @overload def __init__(self, x: float, y: float) -> None: """__init__(*args, **kwargs) @@ -78,7 +71,10 @@ class Point: 1. __init__(self: pybind11_mypy_demo.basics.Point) -> None - 2. __init__(self: pybind11_mypy_demo.basics.Point, x: float, y: float) -> None""" + 2. __init__(self: pybind11_mypy_demo.basics.Point, x: float, y: float) -> None + """ + def as_list(self) -> List[float]: + """as_list(self: pybind11_mypy_demo.basics.Point) -> List[float]""" @overload def distance_to(self, x: float, y: float) -> float: """distance_to(*args, **kwargs) @@ -86,7 +82,8 @@ class Point: 1. 
distance_to(self: pybind11_mypy_demo.basics.Point, x: float, y: float) -> float - 2. distance_to(self: pybind11_mypy_demo.basics.Point, other: pybind11_mypy_demo.basics.Point) -> float""" + 2. distance_to(self: pybind11_mypy_demo.basics.Point, other: pybind11_mypy_demo.basics.Point) -> float + """ @overload def distance_to(self, other: Point) -> float: """distance_to(*args, **kwargs) @@ -94,19 +91,22 @@ class Point: 1. distance_to(self: pybind11_mypy_demo.basics.Point, x: float, y: float) -> float - 2. distance_to(self: pybind11_mypy_demo.basics.Point, other: pybind11_mypy_demo.basics.Point) -> float""" + 2. distance_to(self: pybind11_mypy_demo.basics.Point, other: pybind11_mypy_demo.basics.Point) -> float + """ @property def length(self) -> float: ... def answer() -> int: '''answer() -> int - answer docstring, with end quote"''' + answer docstring, with end quote" + ''' def midpoint(left: float, right: float) -> float: """midpoint(left: float, right: float) -> float""" def sum(arg0: int, arg1: int) -> int: '''sum(arg0: int, arg1: int) -> int - multiline docstring test, edge case quotes """\'\'\'''' + multiline docstring test, edge case quotes """\'\'\' + ''' def weighted_midpoint(left: float, right: float, alpha: float = ...) -> float: """weighted_midpoint(left: float, right: float, alpha: float = 0.5) -> float""" diff --git a/test-data/pybind11_mypy_demo/stubgen/pybind11_mypy_demo/basics.pyi b/test-data/pybind11_mypy_demo/stubgen/pybind11_mypy_demo/basics.pyi index 6527f5733eaf..6f164a03edcc 100644 --- a/test-data/pybind11_mypy_demo/stubgen/pybind11_mypy_demo/basics.pyi +++ b/test-data/pybind11_mypy_demo/stubgen/pybind11_mypy_demo/basics.pyi @@ -1,4 +1,4 @@ -from typing import ClassVar, overload +from typing import ClassVar, List, overload PI: float __version__: str @@ -47,6 +47,7 @@ class Point: def __init__(self) -> None: ... @overload def __init__(self, x: float, y: float) -> None: ... + def as_list(self) -> List[float]: ... @overload def distance_to(self, x: float, y: float) -> float: ... 
@overload diff --git a/test-data/unit/check-class-namedtuple.test b/test-data/unit/check-class-namedtuple.test index a095f212b900..f334b9011645 100644 --- a/test-data/unit/check-class-namedtuple.test +++ b/test-data/unit/check-class-namedtuple.test @@ -301,7 +301,7 @@ reveal_type(X._field_defaults) # N: Revealed type is "builtins.dict[builtins.st # but it's inferred as `Mapping[str, object]` here due to the fixture we're using reveal_type(X.__annotations__) # N: Revealed type is "typing.Mapping[builtins.str, builtins.object]" -[builtins fixtures/dict.pyi] +[builtins fixtures/dict-full.pyi] [case testNewNamedTupleUnit] from typing import NamedTuple diff --git a/test-data/unit/check-dataclasses.test b/test-data/unit/check-dataclasses.test index d37ae569cc5e..107298875761 100644 --- a/test-data/unit/check-dataclasses.test +++ b/test-data/unit/check-dataclasses.test @@ -2531,3 +2531,16 @@ class Foo: c: int # E: Name "c" already defined on line 5 [builtins fixtures/dataclasses.pyi] + +[case testDataclassInheritanceWorksWithExplicitOverrides] +# flags: --enable-error-code explicit-override +from dataclasses import dataclass + +@dataclass +class Base: + x: int + +@dataclass +class Child(Base): + y: int +[builtins fixtures/dataclasses.pyi] diff --git a/test-data/unit/check-errorcodes.test b/test-data/unit/check-errorcodes.test index 2282f21bcfa6..1dd058730f28 100644 --- a/test-data/unit/check-errorcodes.test +++ b/test-data/unit/check-errorcodes.test @@ -975,11 +975,13 @@ def f(d: D, s: str) -> None: [typing fixtures/typing-typeddict.pyi] [case testRecommendErrorCode] -# type: ignore[whatever] # E: type ignore with error code is not supported for modules; use `# mypy: disable-error-code="whatever"` [syntax] +# type: ignore[whatever] # E: type ignore with error code is not supported for modules; use `# mypy: disable-error-code="whatever"` [syntax] \ + # N: Error code "syntax" not covered by "type: ignore" comment 1 + "asdf" [case testRecommendErrorCode2] -# type: ignore[whatever, other] # E: type ignore with error code is not supported for modules; use `# mypy: disable-error-code="whatever, other"` [syntax] +# type: ignore[whatever, other] # E: type ignore with error code is not supported for modules; use `# mypy: disable-error-code="whatever, other"` [syntax] \ + # N: Error code "syntax" not covered by "type: ignore" comment 1 + "asdf" [case testShowErrorCodesInConfig] @@ -1148,3 +1150,35 @@ main:3: note: Revealed local types are: main:3: note: x: builtins.int main:3: error: Name "reveal_locals" is not defined [unimported-reveal] [builtins fixtures/isinstancelist.pyi] + +[case testCovariantMutableOverride] +# flags: --enable-error-code=mutable-override +from typing import Any + +class C: + x: float + y: float + z: float + w: Any + @property + def foo(self) -> float: ... + @property + def bar(self) -> float: ... + @bar.setter + def bar(self, val: float) -> None: ... 
+ baz: float + bad1: float + bad2: float +class D(C): + x: int # E: Covariant override of a mutable attribute (base class "C" defined the type as "float", expression has type "int") [mutable-override] + y: float + z: Any + w: float + foo: int + bar: int # E: Covariant override of a mutable attribute (base class "C" defined the type as "float", expression has type "int") [mutable-override] + def one(self) -> None: + self.baz = 5 + bad1 = 5 # E: Covariant override of a mutable attribute (base class "C" defined the type as "float", expression has type "int") [mutable-override] + def other(self) -> None: + self.bad2: int = 5 # E: Covariant override of a mutable attribute (base class "C" defined the type as "float", expression has type "int") [mutable-override] +[builtins fixtures/property.pyi] diff --git a/test-data/unit/check-expressions.test b/test-data/unit/check-expressions.test index 4ac5512580d2..04b3f7a131cc 100644 --- a/test-data/unit/check-expressions.test +++ b/test-data/unit/check-expressions.test @@ -1589,8 +1589,7 @@ if str(): ....a # E: "ellipsis" has no attribute "a" class A: pass -[builtins fixtures/dict.pyi] -[out] +[builtins fixtures/dict-full.pyi] -- Yield expression @@ -2378,6 +2377,38 @@ assert a == b [builtins fixtures/dict.pyi] [typing fixtures/typing-full.pyi] +[case testStrictEqualityWithRecursiveMapTypes] +# flags: --strict-equality +from typing import Dict + +R = Dict[str, R] + +a: R +b: R +assert a == b + +R2 = Dict[int, R2] +c: R2 +assert a == c # E: Non-overlapping equality check (left operand type: "Dict[str, R]", right operand type: "Dict[int, R2]") +[builtins fixtures/dict.pyi] +[typing fixtures/typing-full.pyi] + +[case testStrictEqualityWithRecursiveListTypes] +# flags: --strict-equality +from typing import List, Union + +R = List[Union[str, R]] + +a: R +b: R +assert a == b + +R2 = List[Union[int, R2]] +c: R2 +assert a == c +[builtins fixtures/list.pyi] +[typing fixtures/typing-full.pyi] + [case testUnimportedHintAny] def f(x: Any) -> None: # E: Name "Any" is not defined \ # N: Did you forget to import it from "typing"? (Suggestion: "from typing import Any") diff --git a/test-data/unit/check-final.test b/test-data/unit/check-final.test index da034caced76..b1378a47b1b1 100644 --- a/test-data/unit/check-final.test +++ b/test-data/unit/check-final.test @@ -1117,3 +1117,60 @@ from typing import Final class MyClass: a: None a: Final[int] = 1 # E: Cannot redefine an existing name as final # E: Name "a" already defined on line 5 + +[case testFinalOverrideAllowedForPrivate] +from typing import Final, final + +class Parent: + __foo: Final[int] = 0 + @final + def __bar(self) -> None: ... + +class Child(Parent): + __foo: Final[int] = 1 + @final + def __bar(self) -> None: ... + +[case testFinalWithoutBool] +from typing_extensions import final, Literal + +class A: + pass + +@final +class B: + pass + +@final +class C: + def __len__(self) -> Literal[1]: return 1 + +reveal_type(A() and 42) # N: Revealed type is "Union[__main__.A, Literal[42]?]" +reveal_type(B() and 42) # N: Revealed type is "Literal[42]?" +reveal_type(C() and 42) # N: Revealed type is "Literal[42]?" + +[builtins fixtures/bool.pyi] + +[case testFinalWithoutBoolButWithLen] +from typing_extensions import final, Literal + +# Per Python data model, __len__ is called if __bool__ does not exist. +# In a @final class, __bool__ would not exist. + +@final +class A: + def __len__(self) -> int: ... 
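
The testCovariantMutableOverride case above exercises the `mutable-override` error code. The underlying issue is that narrowing a writable attribute in a subclass is unsound: code holding a base-class reference may legally store a wider value. A minimal runnable sketch with illustrative class names:

```python
class Base:
    x: float = 0.0


class Narrowed(Base):
    x: int = 0  # the kind of override --enable-error-code=mutable-override reports


def set_half(obj: Base) -> None:
    # Perfectly legal for a Base, since Base.x is a float...
    obj.x = 0.5


n = Narrowed()
set_half(n)
print(n.x)  # 0.5 at runtime, even though Narrowed.x is annotated as int
```
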
+ +@final +class B: + def __len__(self) -> Literal[1]: return 1 + +@final +class C: + def __len__(self) -> Literal[0]: return 0 + +reveal_type(A() and 42) # N: Revealed type is "Union[__main__.A, Literal[42]?]" +reveal_type(B() and 42) # N: Revealed type is "Literal[42]?" +reveal_type(C() and 42) # N: Revealed type is "__main__.C" + +[builtins fixtures/bool.pyi] diff --git a/test-data/unit/check-functions.test b/test-data/unit/check-functions.test index cd098a84d4d3..b3df5fddafba 100644 --- a/test-data/unit/check-functions.test +++ b/test-data/unit/check-functions.test @@ -3159,6 +3159,16 @@ class D(A, B): def f(self, z: int) -> str: pass # E: Method "f" is not using @override but is overriding a method in class "__main__.A" [typing fixtures/typing-override.pyi] +[case testExplicitOverrideAllowedForPrivate] +# flags: --enable-error-code explicit-override --python-version 3.12 + +class B: + def __f(self, y: int) -> str: pass + +class C(B): + def __f(self, y: int) -> str: pass # OK +[typing fixtures/typing-override.pyi] + [case testCallableProperty] from typing import Callable diff --git a/test-data/unit/check-incremental.test b/test-data/unit/check-incremental.test index 806a585bff39..2c7d908c5f5b 100644 --- a/test-data/unit/check-incremental.test +++ b/test-data/unit/check-incremental.test @@ -6560,3 +6560,26 @@ class C: [out] [out2] tmp/a.py:3: note: Revealed type is "TypedDict('b.C.Hidden@5', {'x': builtins.int})" + +[case testNoIncrementalCrashOnInvalidEnumMethod] +import a +[file a.py] +from lib import TheClass +[file a.py.2] +from lib import TheClass +x: TheClass +reveal_type(x.enum_type) +[file lib.py] +import enum + +class TheClass: + def __init__(self) -> None: + names = ["foo"] + pyenum = enum.Enum('Blah', { # type: ignore[misc] + x.upper(): x + for x in names + }) + self.enum_type = pyenum +[out] +[out2] +tmp/a.py:3: note: Revealed type is "def (value: builtins.object) -> lib.TheClass.pyenum@6" diff --git a/test-data/unit/check-inference-context.test b/test-data/unit/check-inference-context.test index a933acbf7f32..afe6548df2d4 100644 --- a/test-data/unit/check-inference-context.test +++ b/test-data/unit/check-inference-context.test @@ -1482,3 +1482,16 @@ b: Any i = i if isinstance(i, int) else b reveal_type(i) # N: Revealed type is "Union[Any, builtins.int]" [builtins fixtures/isinstance.pyi] + +[case testLambdaInferenceUsesNarrowedTypes] +from typing import Optional, Callable + +def f1(key: Callable[[], str]) -> None: ... +def f2(key: object) -> None: ... + +def g(b: Optional[str]) -> None: + if b: + f1(lambda: reveal_type(b)) # N: Revealed type is "builtins.str" + z: Callable[[], str] = lambda: reveal_type(b) # N: Revealed type is "builtins.str" + f2(lambda: reveal_type(b)) # N: Revealed type is "builtins.str" + lambda: reveal_type(b) # N: Revealed type is "builtins.str" diff --git a/test-data/unit/check-inference.test b/test-data/unit/check-inference.test index 0d162238450a..953855e502d6 100644 --- a/test-data/unit/check-inference.test +++ b/test-data/unit/check-inference.test @@ -3767,3 +3767,49 @@ def f(values: List[T]) -> T: ... x = foo(f([C()])) reveal_type(x) # N: Revealed type is "__main__.C" [builtins fixtures/list.pyi] + +[case testInferenceAgainstGenericCallableUnion] +from typing import Callable, TypeVar, List, Union + +T = TypeVar("T") +S = TypeVar("S") + +def dec(f: Callable[[S], T]) -> Callable[[S], List[T]]: ... +@dec +def func(arg: T) -> Union[T, str]: + ... 
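
The testFinalWithoutBool and testFinalWithoutBoolButWithLen cases above lean on two facts: truthiness falls back to `__len__` when `__bool__` is absent, and a `@final` class can never gain either method in a subclass, which is what lets mypy narrow `and`/`or` results. A quick runtime illustration (the class names are stand-ins):

```python
from typing import final


@final
class AlwaysTruthy:
    def __len__(self) -> int:
        return 1


@final
class AlwaysFalsy:
    def __len__(self) -> int:
        return 0


print(AlwaysTruthy() and 42)  # 42: bool() falls back to __len__, which is nonzero
print(AlwaysFalsy() and 42)   # the AlwaysFalsy instance: the left operand is falsy
```
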
+reveal_type(func) # N: Revealed type is "def [S] (S`1) -> builtins.list[Union[S`1, builtins.str]]" +reveal_type(func(42)) # N: Revealed type is "builtins.list[Union[builtins.int, builtins.str]]" + +def dec2(f: Callable[[S], List[T]]) -> Callable[[S], T]: ... +@dec2 +def func2(arg: T) -> List[Union[T, str]]: + ... +reveal_type(func2) # N: Revealed type is "def [S] (S`4) -> Union[S`4, builtins.str]" +reveal_type(func2(42)) # N: Revealed type is "Union[builtins.int, builtins.str]" +[builtins fixtures/list.pyi] + +[case testInferenceAgainstGenericCallbackProtoMultiple] +from typing import Callable, Protocol, TypeVar +from typing_extensions import Concatenate, ParamSpec + +V_co = TypeVar("V_co", covariant=True) +class Metric(Protocol[V_co]): + def __call__(self) -> V_co: ... + +T = TypeVar("T") +P = ParamSpec("P") +def simple_metric(func: Callable[Concatenate[int, P], T]) -> Callable[P, T]: ... + +@simple_metric +def Negate(count: int, /, metric: Metric[float]) -> float: ... +@simple_metric +def Combine(count: int, m1: Metric[T], m2: Metric[T], /, *more: Metric[T]) -> T: ... + +reveal_type(Negate) # N: Revealed type is "def (metric: __main__.Metric[builtins.float]) -> builtins.float" +reveal_type(Combine) # N: Revealed type is "def [T] (def () -> T`4, def () -> T`4, *more: def () -> T`4) -> T`4" + +def m1() -> float: ... +def m2() -> float: ... +reveal_type(Combine(m1, m2)) # N: Revealed type is "builtins.float" +[builtins fixtures/list.pyi] diff --git a/test-data/unit/check-namedtuple.test b/test-data/unit/check-namedtuple.test index 14e075339572..51b02b500bd1 100644 --- a/test-data/unit/check-namedtuple.test +++ b/test-data/unit/check-namedtuple.test @@ -1354,3 +1354,20 @@ class Test: self.item: self.Item # E: Name "self.Item" is not defined [builtins fixtures/tuple.pyi] [typing fixtures/typing-namedtuple.pyi] + +[case testNoClassKeywordsForNamedTuple] +from typing import NamedTuple +class Test1(NamedTuple, x=1, y=2): # E: Unexpected keyword argument "x" for "__init_subclass__" of "NamedTuple" \ + # E: Unexpected keyword argument "y" for "__init_subclass__" of "NamedTuple" + ... + +class Meta(type): ... + +class Test2(NamedTuple, metaclass=Meta): # E: Unexpected keyword argument "metaclass" for "__init_subclass__" of "NamedTuple" + ... + +# Technically this would work, but it is just easier for the implementation: +class Test3(NamedTuple, metaclass=type): # E: Unexpected keyword argument "metaclass" for "__init_subclass__" of "NamedTuple" + ... +[builtins fixtures/tuple.pyi] +[typing fixtures/typing-namedtuple.pyi] diff --git a/test-data/unit/check-narrowing.test b/test-data/unit/check-narrowing.test index 5b7fadf41c79..a2859dfffa3a 100644 --- a/test-data/unit/check-narrowing.test +++ b/test-data/unit/check-narrowing.test @@ -1020,6 +1020,105 @@ else: reveal_type(true_or_false) # N: Revealed type is "Literal[False]" [builtins fixtures/primitives.pyi] + +[case testNarrowingIsInstanceFinalSubclass] +# flags: --warn-unreachable + +from typing import final + +class N: ... +@final +class F1: ... +@final +class F2: ... 
+ +n: N +f1: F1 + +if isinstance(f1, F1): + reveal_type(f1) # N: Revealed type is "__main__.F1" +else: + reveal_type(f1) # E: Statement is unreachable + +if isinstance(n, F1): # E: Subclass of "N" and "F1" cannot exist: "F1" is final + reveal_type(n) # E: Statement is unreachable +else: + reveal_type(n) # N: Revealed type is "__main__.N" + +if isinstance(f1, N): # E: Subclass of "F1" and "N" cannot exist: "F1" is final + reveal_type(f1) # E: Statement is unreachable +else: + reveal_type(f1) # N: Revealed type is "__main__.F1" + +if isinstance(f1, F2): # E: Subclass of "F1" and "F2" cannot exist: "F1" is final \ + # E: Subclass of "F1" and "F2" cannot exist: "F2" is final + reveal_type(f1) # E: Statement is unreachable +else: + reveal_type(f1) # N: Revealed type is "__main__.F1" +[builtins fixtures/isinstance.pyi] + + +[case testNarrowingIsInstanceFinalSubclassWithUnions] +# flags: --warn-unreachable + +from typing import final, Union + +class N: ... +@final +class F1: ... +@final +class F2: ... + +n_f1: Union[N, F1] +n_f2: Union[N, F2] +f1_f2: Union[F1, F2] + +if isinstance(n_f1, F1): + reveal_type(n_f1) # N: Revealed type is "__main__.F1" +else: + reveal_type(n_f1) # N: Revealed type is "__main__.N" + +if isinstance(n_f2, F1): # E: Subclass of "N" and "F1" cannot exist: "F1" is final \ + # E: Subclass of "F2" and "F1" cannot exist: "F2" is final \ + # E: Subclass of "F2" and "F1" cannot exist: "F1" is final + reveal_type(n_f2) # E: Statement is unreachable +else: + reveal_type(n_f2) # N: Revealed type is "Union[__main__.N, __main__.F2]" + +if isinstance(f1_f2, F1): + reveal_type(f1_f2) # N: Revealed type is "__main__.F1" +else: + reveal_type(f1_f2) # N: Revealed type is "__main__.F2" +[builtins fixtures/isinstance.pyi] + + +[case testNarrowingIsSubclassFinalSubclassWithTypeVar] +# flags: --warn-unreachable + +from typing import final, Type, TypeVar + +@final +class A: ... +@final +class B: ... + +T = TypeVar("T", A, B) + +def f(cls: Type[T]) -> T: + if issubclass(cls, A): + reveal_type(cls) # N: Revealed type is "Type[__main__.A]" + x: bool + if x: + return A() + else: + return B() # E: Incompatible return value type (got "B", expected "A") + assert False + +reveal_type(f(A)) # N: Revealed type is "__main__.A" +reveal_type(f(B)) # N: Revealed type is "__main__.B" +[builtins fixtures/isinstance.pyi] + + [case testNarrowingLiteralIdentityCheck] from typing import Union from typing_extensions import Literal @@ -1910,3 +2009,16 @@ if len(x) == a: else: reveal_type(x) # N: Revealed type is "Union[Tuple[builtins.int, builtins.int], Tuple[builtins.int, builtins.int, builtins.int]]" [builtins fixtures/len.pyi] + +[case testNarrowingLenUnionWithUnreachable] +from typing import Union, Sequence + +def f(x: Union[int, Sequence[int]]) -> None: + if ( + isinstance(x, tuple) + and len(x) == 2 + and isinstance(x[0], int) + and isinstance(x[1], int) + ): + reveal_type(x) # N: Revealed type is "Tuple[builtins.int, builtins.int]" +[builtins fixtures/len.pyi] diff --git a/test-data/unit/check-parameter-specification.test b/test-data/unit/check-parameter-specification.test index db8c76fd21e9..af2be84f5412 100644 --- a/test-data/unit/check-parameter-specification.test +++ b/test-data/unit/check-parameter-specification.test @@ -1687,9 +1687,18 @@ P = ParamSpec("P") T = TypeVar("T") def apply(fn: Callable[P, T], *args: P.args, **kwargs: P.kwargs) -> None: ... + def test(x: int) -> int: ... 
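
The isinstance/issubclass narrowing cases above rely on finality: a `@final` class cannot be subclassed, so no object can be an instance of both it and an unrelated class, making the matching branch unreachable. A small sketch of that split; note `typing.final` is advisory at runtime, so the "no subclass" guarantee is enforced by the type checker rather than the interpreter:

```python
from typing import final


class N: ...


@final
class F1: ...


def describe(obj: N) -> str:
    if isinstance(obj, F1):
        # Under --warn-unreachable this branch is now flagged: a subclass of
        # both N and F1 cannot exist because F1 is final.
        return "impossible"
    return "just an N"


print(describe(N()))
```
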
apply(apply, test, x=42) # OK apply(apply, test, 42) # Also OK (but requires some special casing) +apply(apply, test, "bad") # E: Argument 1 to "apply" has incompatible type "Callable[[Callable[P, T], **P], None]"; expected "Callable[[Callable[[int], int], str], None]" + +def test2(x: int, y: str) -> None: ... +apply(apply, test2, 42, "yes") +apply(apply, test2, "no", 42) # E: Argument 1 to "apply" has incompatible type "Callable[[Callable[P, T], **P], None]"; expected "Callable[[Callable[[int, str], None], str, int], None]" +apply(apply, test2, x=42, y="yes") +apply(apply, test2, y="yes", x=42) +apply(apply, test2, y=42, x="no") # E: Argument 1 to "apply" has incompatible type "Callable[[Callable[P, T], **P], None]"; expected "Callable[[Callable[[int, str], None], int, str], None]" [builtins fixtures/paramspec.pyi] [case testParamSpecApplyPosVsNamedOptional] @@ -2086,3 +2095,94 @@ reveal_type(d(b, f1)) # E: Cannot infer type argument 1 of "d" \ # N: Revealed type is "def (*Any, **Any)" reveal_type(d(b, f2)) # N: Revealed type is "def (builtins.int)" [builtins fixtures/paramspec.pyi] + +[case testParamSpecGenericWithNamedArg1] +from typing import Callable, TypeVar +from typing_extensions import ParamSpec + +R = TypeVar("R") +P = ParamSpec("P") + +def run(func: Callable[[], R], *args: object, backend: str = "asyncio") -> R: ... +class Result: ... +def run_portal() -> Result: ... +def submit(func: Callable[P, R], /, *args: P.args, **kwargs: P.kwargs) -> R: ... + +reveal_type(submit( # N: Revealed type is "__main__.Result" + run, + run_portal, + backend="asyncio", +)) +submit( + run, # E: Argument 1 to "submit" has incompatible type "Callable[[Callable[[], R], VarArg(object), DefaultNamedArg(str, 'backend')], R]"; expected "Callable[[Callable[[], Result], int], Result]" + run_portal, + backend=int(), +) +[builtins fixtures/paramspec.pyi] + +[case testParamSpecGenericWithNamedArg2] +from typing import Callable, TypeVar, Type +from typing_extensions import ParamSpec + +P= ParamSpec("P") +T = TypeVar("T") + +def smoke_testable(*args: P.args, **kwargs: P.kwargs) -> Callable[[Callable[P, T]], Type[T]]: + ... + +@smoke_testable(name="bob", size=512, flt=0.5) +class SomeClass: + def __init__(self, size: int, name: str, flt: float) -> None: + pass + +# Error message is confusing, but this is a known issue, see #4530. +@smoke_testable(name=42, size="bad", flt=0.5) # E: Argument 1 has incompatible type "Type[OtherClass]"; expected "Callable[[int, str, float], OtherClass]" +class OtherClass: + def __init__(self, size: int, name: str, flt: float) -> None: + pass +[builtins fixtures/paramspec.pyi] + +[case testInferenceAgainstGenericCallableUnionParamSpec] +from typing import Callable, TypeVar, List, Union +from typing_extensions import ParamSpec + +T = TypeVar("T") +P = ParamSpec("P") + +def dec(f: Callable[P, T]) -> Callable[P, List[T]]: ... +@dec +def func(arg: T) -> Union[T, str]: + ... +reveal_type(func) # N: Revealed type is "def [T] (arg: T`-1) -> builtins.list[Union[T`-1, builtins.str]]" +reveal_type(func(42)) # N: Revealed type is "builtins.list[Union[builtins.int, builtins.str]]" + +def dec2(f: Callable[P, List[T]]) -> Callable[P, T]: ... +@dec2 +def func2(arg: T) -> List[Union[T, str]]: + ... 
+reveal_type(func2) # N: Revealed type is "def [T] (arg: T`-1) -> Union[T`-1, builtins.str]" +reveal_type(func2(42)) # N: Revealed type is "Union[builtins.int, builtins.str]" +[builtins fixtures/paramspec.pyi] + +[case testParamSpecPreciseKindsUsedIfPossible] +from typing import Callable, Generic +from typing_extensions import ParamSpec + +P = ParamSpec('P') + +class Case(Generic[P]): + def __init__(self, *args: P.args, **kwargs: P.kwargs) -> None: + pass + +def _test(a: int, b: int = 0) -> None: ... + +def parametrize( + func: Callable[P, None], *cases: Case[P], **named_cases: Case[P] +) -> Callable[[], None]: + ... + +parametrize(_test, Case(1, 2), Case(3, 4)) +parametrize(_test, Case(1, b=2), Case(3, b=4)) +parametrize(_test, Case(1, 2), Case(3)) +parametrize(_test, Case(1, 2), Case(3, b=4)) +[builtins fixtures/paramspec.pyi] diff --git a/test-data/unit/check-python310.test b/test-data/unit/check-python310.test index d3cdf3af849d..cbb26a130738 100644 --- a/test-data/unit/check-python310.test +++ b/test-data/unit/check-python310.test @@ -931,7 +931,7 @@ match x: reveal_type(x) # N: Revealed type is "builtins.list[builtins.list[builtins.dict[builtins.int, builtins.int]]]" reveal_type(y) # N: Revealed type is "builtins.int" reveal_type(z) # N: Revealed type is "builtins.int" -[builtins fixtures/dict.pyi] +[builtins fixtures/dict-full.pyi] [case testMatchNonFinalMatchArgs] class A: diff --git a/test-data/unit/check-python312.test b/test-data/unit/check-python312.test index cb89eb34880c..285563c19991 100644 --- a/test-data/unit/check-python312.test +++ b/test-data/unit/check-python312.test @@ -11,6 +11,11 @@ def g(x: MyList[int]) -> MyList[int]: # E: Variable "__main__.MyList" is not va # N: See https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases return reveal_type(x) # N: Revealed type is "MyList?[builtins.int]" +type MyInt2 = int # type: ignore[valid-type] + +def h(x: MyInt2) -> MyInt2: + return reveal_type(x) # N: Revealed type is "builtins.int" + [case test695Class] class MyGen[T]: # E: PEP 695 generics are not yet supported def __init__(self, x: T) -> None: # E: Name "T" is not defined diff --git a/test-data/unit/check-selftype.test b/test-data/unit/check-selftype.test index 29abe9cb025b..e49a7a0e2e2f 100644 --- a/test-data/unit/check-selftype.test +++ b/test-data/unit/check-selftype.test @@ -2056,3 +2056,18 @@ reveal_type(C.copy(c)) # N: Revealed type is "__main__.C[builtins.int, builtins B.copy(42) # E: Value of type variable "Self" of "copy" of "B" cannot be "int" C.copy(42) # E: Value of type variable "Self" of "copy" of "B" cannot be "int" [builtins fixtures/tuple.pyi] + +[case testRecursiveSelfTypeCallMethodNoCrash] +from typing import Callable, TypeVar + +T = TypeVar("T") +class Partial: + def __call__(self: Callable[..., T]) -> T: ... + +class Partial2: + def __call__(self: Callable[..., T], x: T) -> T: ... 
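
testParamSpecPreciseKindsUsedIfPossible above checks that `Case[P]` captures the precise argument kinds of the function under test. Below is a runnable sketch of that parametrize pattern, simplified from the test (no `named_cases`, and with bodies filled in) and assuming Python 3.10+ so `ParamSpec` works in `typing.Callable`:

```python
from typing import Callable, Generic

from typing_extensions import ParamSpec

P = ParamSpec("P")


class Case(Generic[P]):
    # Stores arguments matching the parameter specification of the target.
    def __init__(self, *args: P.args, **kwargs: P.kwargs) -> None:
        self.args = args
        self.kwargs = kwargs


def parametrize(func: Callable[P, None], *cases: Case[P]) -> Callable[[], None]:
    def run() -> None:
        for case in cases:
            func(*case.args, **case.kwargs)
    return run


def _test(a: int, b: int = 0) -> None:
    print(f"a={a} b={b}")


run_all = parametrize(_test, Case(1, 2), Case(3, b=4), Case(5))
run_all()
```
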
+ +p: Partial +reveal_type(p()) # N: Revealed type is "Never" +p2: Partial2 +reveal_type(p2(42)) # N: Revealed type is "builtins.int" diff --git a/test-data/unit/check-super.test b/test-data/unit/check-super.test index 48a0a0250ecf..8816322a270a 100644 --- a/test-data/unit/check-super.test +++ b/test-data/unit/check-super.test @@ -280,8 +280,12 @@ class B(A): [case testSuperOutsideMethodNoCrash] -class C: - a = super().whatever # E: super() outside of a method is not supported +class A: + x = 1 +class B(A): pass +class C(B): + b = super(B, B).x + a = super().whatever # E: "super()" outside of a method is not supported [case testSuperWithObjectClassAsFirstArgument] class A: @@ -366,13 +370,22 @@ class C(B): [case testSuperInMethodWithNoArguments] class A: def f(self) -> None: pass + @staticmethod + def st() -> int: + return 1 class B(A): def g() -> None: # E: Method must have at least one argument. Did you forget the "self" argument? - super().f() # E: super() requires one or more positional arguments in enclosing function + super().f() # E: "super()" requires one or two positional arguments in enclosing function def h(self) -> None: def a() -> None: - super().f() # E: super() requires one or more positional arguments in enclosing function + super().f() # E: "super()" requires one or two positional arguments in enclosing function + @staticmethod + def st() -> int: + reveal_type(super(B, B).st()) # N: Revealed type is "builtins.int" + super().st() # E: "super()" requires one or two positional arguments in enclosing function + return 2 +[builtins fixtures/staticmethod.pyi] [case testSuperWithUnsupportedTypeObject] from typing import Type diff --git a/test-data/unit/check-tuples.test b/test-data/unit/check-tuples.test index 4f468b59fc3f..66115ca0c30d 100644 --- a/test-data/unit/check-tuples.test +++ b/test-data/unit/check-tuples.test @@ -957,6 +957,12 @@ for x in B(), A(): [builtins fixtures/for.pyi] [case testTupleIterable] +from typing import Iterable, Optional, TypeVar + +T = TypeVar("T") + +def sum(iterable: Iterable[T], start: Optional[T] = None) -> T: pass + y = 'a' x = sum((1,2)) if int(): diff --git a/test-data/unit/check-type-aliases.test b/test-data/unit/check-type-aliases.test index 46f5ff07f1ac..4364a9bfa9dc 100644 --- a/test-data/unit/check-type-aliases.test +++ b/test-data/unit/check-type-aliases.test @@ -1065,4 +1065,4 @@ def eval(e: Expr) -> int: return e[1] elif e[0] == 456: return -eval(e[1]) -[builtins fixtures/dict.pyi] +[builtins fixtures/dict-full.pyi] diff --git a/test-data/unit/check-typeddict.test b/test-data/unit/check-typeddict.test index 088b52db0473..d8022f85574c 100644 --- a/test-data/unit/check-typeddict.test +++ b/test-data/unit/check-typeddict.test @@ -2708,7 +2708,7 @@ class TD(TypedDict): reveal_type(TD.__iter__) # N: Revealed type is "def (typing._TypedDict) -> typing.Iterator[builtins.str]" reveal_type(TD.__annotations__) # N: Revealed type is "typing.Mapping[builtins.str, builtins.object]" reveal_type(TD.values) # N: Revealed type is "def (self: typing.Mapping[T`1, T_co`2]) -> typing.Iterable[T_co`2]" -[builtins fixtures/dict.pyi] +[builtins fixtures/dict-full.pyi] [typing fixtures/typing-typeddict.pyi] [case testGenericTypedDictAlias] @@ -3299,7 +3299,7 @@ main:10: error: No overload variant of "__ror__" of "dict" matches argument type main:10: note: Possible overload variants: main:10: note: def __ror__(self, Dict[Any, Any], /) -> Dict[Any, Any] main:10: note: def [T, T2] __ror__(self, Dict[T, T2], /) -> Dict[Union[Any, T], Union[Any, T2]] -[builtins 
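
The check-super.test changes above adjust the error wording to "one or two positional arguments" and add a static-method case that uses the explicit two-argument form. A runtime illustration of why that form is the one that works inside a staticmethod (there is no instance or class argument for the zero-argument form to infer from):

```python
class A:
    @staticmethod
    def st() -> int:
        return 1


class B(A):
    @staticmethod
    def st() -> int:
        # super() with no arguments cannot work here; the explicit
        # two-argument form looks st up on A and returns the plain function.
        return super(B, B).st() + 1


print(B.st())  # 2
```
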
fixtures/dict.pyi] +[builtins fixtures/dict-full.pyi] [typing fixtures/typing-typeddict-iror.pyi] [case testTypedDictWith__ror__method] @@ -3379,3 +3379,62 @@ bar |= d1 # E: Argument 1 to "__ior__" of "TypedDict" has incompatible type "Di bar |= d2 # E: Argument 1 to "__ior__" of "TypedDict" has incompatible type "Dict[int, str]"; expected "TypedDict({'key'?: int, 'value'?: str})" [builtins fixtures/dict.pyi] [typing fixtures/typing-typeddict-iror.pyi] + +[case testGenericTypedDictStrictOptionalExtending] +from typing import Generic, TypeVar, TypedDict, Optional + +T = TypeVar("T") +class Foo(TypedDict, Generic[T], total=False): + a: Optional[str] + g: Optional[T] + +class Bar(Foo[T], total=False): + other: str + +b: Bar[int] +reveal_type(b["a"]) # N: Revealed type is "Union[builtins.str, None]" +reveal_type(b["g"]) # N: Revealed type is "Union[builtins.int, None]" +[builtins fixtures/dict.pyi] +[typing fixtures/typing-typeddict.pyi] + +[case testNoCrashOnUnImportedAnyNotRequired] +# flags: --disallow-any-unimported +from typing import NotRequired, Required, TypedDict +from thismoduledoesntexist import T # type: ignore[import] + +B = TypedDict("B", { # E: Type of a TypedDict key becomes "Any" due to an unfollowed import + "T1": NotRequired[T], + "T2": Required[T], +}) +[builtins fixtures/dict.pyi] +[typing fixtures/typing-typeddict.pyi] + +[case testTypedDictWithClassLevelKeywords] +from typing import TypedDict, Generic, TypeVar + +T = TypeVar('T') + +class Meta(type): ... + +class WithMetaKeyword(TypedDict, metaclass=Meta): # E: Unexpected keyword argument "metaclass" for "__init_subclass__" of "TypedDict" + ... + +class GenericWithMetaKeyword(TypedDict, Generic[T], metaclass=Meta): # E: Unexpected keyword argument "metaclass" for "__init_subclass__" of "TypedDict" + ... + +# We still don't allow this, because the implementation is much easier +# and it does not make any practical sense to do it: +class WithTypeMeta(TypedDict, metaclass=type): # E: Unexpected keyword argument "metaclass" for "__init_subclass__" of "TypedDict" + ... + +class OtherKeywords(TypedDict, a=1, b=2, c=3, total=True): # E: Unexpected keyword argument "a" for "__init_subclass__" of "TypedDict" \ + # E: Unexpected keyword argument "b" for "__init_subclass__" of "TypedDict" \ + # E: Unexpected keyword argument "c" for "__init_subclass__" of "TypedDict" + ... + +class TotalInTheMiddle(TypedDict, a=1, total=True, b=2, c=3): # E: Unexpected keyword argument "a" for "__init_subclass__" of "TypedDict" \ + # E: Unexpected keyword argument "b" for "__init_subclass__" of "TypedDict" \ + # E: Unexpected keyword argument "c" for "__init_subclass__" of "TypedDict" + ... +[builtins fixtures/dict.pyi] +[typing fixtures/typing-typeddict.pyi] diff --git a/test-data/unit/check-typeguard.test b/test-data/unit/check-typeguard.test index b3b168e5c7c6..c48887bb016a 100644 --- a/test-data/unit/check-typeguard.test +++ b/test-data/unit/check-typeguard.test @@ -694,3 +694,18 @@ def foo(x: object) -> TypeGuard[List[str]]: ... def test(f: A[T]) -> T: ... 
reveal_type(test(foo)) # N: Revealed type is "builtins.str" [builtins fixtures/list.pyi] + +[case testNoCrashOnDunderCallTypeGuard] +from typing_extensions import TypeGuard + +class A: + def __call__(self, x) -> TypeGuard[int]: + return True + +a: A +assert a(x=1) + +x: object +assert a(x=x) +reveal_type(x) # N: Revealed type is "builtins.int" +[builtins fixtures/tuple.pyi] diff --git a/test-data/unit/check-typevar-tuple.test b/test-data/unit/check-typevar-tuple.test index a51b535a873c..9c8d21114d4c 100644 --- a/test-data/unit/check-typevar-tuple.test +++ b/test-data/unit/check-typevar-tuple.test @@ -1789,6 +1789,24 @@ def test(a: Container[Any], b: Container[int], c: Container[str]): reveal_type(build(b, c)) # N: Revealed type is "__main__.Array[builtins.int, builtins.str]" [builtins fixtures/tuple.pyi] +[case testTypeVarTupleOverloadArbitraryLength] +from typing import Any, Tuple, TypeVar, TypeVarTuple, Unpack, overload + +T = TypeVar("T") +Ts = TypeVarTuple("Ts") +@overload +def add(self: Tuple[Unpack[Ts]], other: Tuple[T]) -> Tuple[Unpack[Ts], T]: + ... +@overload +def add(self: Tuple[T, ...], other: Tuple[T, ...]) -> Tuple[T, ...]: + ... +def add(self: Any, other: Any) -> Any: + ... +def test(a: Tuple[int, str], b: Tuple[bool], c: Tuple[bool, ...]): + reveal_type(add(a, b)) # N: Revealed type is "Tuple[builtins.int, builtins.str, builtins.bool]" + reveal_type(add(b, c)) # N: Revealed type is "builtins.tuple[builtins.bool, ...]" +[builtins fixtures/tuple.pyi] + [case testTypeVarTupleIndexOldStyleNonNormalizedAndNonLiteral] from typing import Any, Tuple from typing_extensions import Unpack @@ -2185,3 +2203,77 @@ def test2( # E: Missing named argument "b" return func(*args, **kwargs) [builtins fixtures/tuple.pyi] + +[case testUnpackTupleSpecialCaseNoCrash] +from typing import Tuple, TypeVar +from typing_extensions import Unpack + +T = TypeVar("T") + +def foo(*x: object) -> None: ... +def bar(*x: int) -> None: ... +def baz(*x: T) -> T: ... + +keys: Tuple[Unpack[Tuple[int, ...]]] + +foo(keys, 1) +foo(*keys, 1) + +bar(keys, 1) # E: Argument 1 to "bar" has incompatible type "Tuple[Unpack[Tuple[int, ...]]]"; expected "int" +bar(*keys, 1) # OK + +reveal_type(baz(keys, 1)) # N: Revealed type is "builtins.object" +reveal_type(baz(*keys, 1)) # N: Revealed type is "builtins.int" +[builtins fixtures/tuple.pyi] + +[case testVariadicTupleContextNoCrash] +from typing import Tuple, Unpack + +x: Tuple[int, Unpack[Tuple[int, ...]]] = () # E: Incompatible types in assignment (expression has type "Tuple[()]", variable has type "Tuple[int, Unpack[Tuple[int, ...]]]") +y: Tuple[int, Unpack[Tuple[int, ...]]] = (1, 2) +z: Tuple[int, Unpack[Tuple[int, ...]]] = (1,) +w: Tuple[int, Unpack[Tuple[int, ...]]] = (1, *[2, 3, 4]) +t: Tuple[int, Unpack[Tuple[int, ...]]] = (1, *(2, 3, 4)) +[builtins fixtures/tuple.pyi] + +[case testAliasToCallableWithUnpack] +from typing import Any, Callable, Tuple, Unpack + +_CallableValue = Callable[[Unpack[Tuple[Any, ...]]], Any] +def higher_order(f: _CallableValue) -> None: ... + +def good1(*args: int) -> None: ... +def good2(*args: str) -> int: ... + +def bad1(a: str, b: int, /) -> None: ... +def bad2(c: bytes, *args: int) -> str: ... +def bad3(*, d: str) -> int: ... +def bad4(**kwargs: None) -> None: ... 
+ +higher_order(good1) +higher_order(good2) + +higher_order(bad1) # E: Argument 1 to "higher_order" has incompatible type "Callable[[str, int], None]"; expected "Callable[[VarArg(Any)], Any]" +higher_order(bad2) # E: Argument 1 to "higher_order" has incompatible type "Callable[[bytes, VarArg(int)], str]"; expected "Callable[[VarArg(Any)], Any]" +higher_order(bad3) # E: Argument 1 to "higher_order" has incompatible type "Callable[[NamedArg(str, 'd')], int]"; expected "Callable[[VarArg(Any)], Any]" +higher_order(bad4) # E: Argument 1 to "higher_order" has incompatible type "Callable[[KwArg(None)], None]"; expected "Callable[[VarArg(Any)], Any]" +[builtins fixtures/tuple.pyi] + +[case testAliasToCallableWithUnpack2] +from typing import Any, Callable, Tuple, Unpack + +_CallableValue = Callable[[int, str, Unpack[Tuple[Any, ...]], int], Any] +def higher_order(f: _CallableValue) -> None: ... + +def good(a: int, b: str, *args: Unpack[Tuple[Unpack[Tuple[Any, ...]], int]]) -> int: ... +def bad1(a: str, b: int, /) -> None: ... +def bad2(c: bytes, *args: int) -> str: ... +def bad3(*, d: str) -> int: ... +def bad4(**kwargs: None) -> None: ... + +higher_order(good) +higher_order(bad1) # E: Argument 1 to "higher_order" has incompatible type "Callable[[str, int], None]"; expected "Callable[[int, str, VarArg(Unpack[Tuple[Unpack[Tuple[Any, ...]], int]])], Any]" +higher_order(bad2) # E: Argument 1 to "higher_order" has incompatible type "Callable[[bytes, VarArg(int)], str]"; expected "Callable[[int, str, VarArg(Unpack[Tuple[Unpack[Tuple[Any, ...]], int]])], Any]" +higher_order(bad3) # E: Argument 1 to "higher_order" has incompatible type "Callable[[NamedArg(str, 'd')], int]"; expected "Callable[[int, str, VarArg(Unpack[Tuple[Unpack[Tuple[Any, ...]], int]])], Any]" +higher_order(bad4) # E: Argument 1 to "higher_order" has incompatible type "Callable[[KwArg(None)], None]"; expected "Callable[[int, str, VarArg(Unpack[Tuple[Unpack[Tuple[Any, ...]], int]])], Any]" +[builtins fixtures/tuple.pyi] diff --git a/test-data/unit/daemon.test b/test-data/unit/daemon.test index 77367eb02bfe..ca2c969d2f5e 100644 --- a/test-data/unit/daemon.test +++ b/test-data/unit/daemon.test @@ -360,6 +360,33 @@ def bar() -> None: x = foo('abc') # type: str foo(arg='xyz') +[case testDaemonInspectCheck] +$ dmypy start +Daemon started +$ dmypy check foo.py +Success: no issues found in 1 source file +$ dmypy check foo.py --export-types +Success: no issues found in 1 source file +$ dmypy inspect foo.py:1:1 +"int" +[file foo.py] +x = 1 + +[case testDaemonInspectRun] +$ dmypy run test1.py +Daemon started +Success: no issues found in 1 source file +$ dmypy run test2.py +Success: no issues found in 1 source file +$ dmypy run test1.py --export-types +Success: no issues found in 1 source file +$ dmypy inspect test1.py:1:1 +"int" +[file test1.py] +a: int +[file test2.py] +a: str + [case testDaemonGetType] $ dmypy start --log-file log.txt -- --follow-imports=error --no-error-summary --python-version 3.8 Daemon started diff --git a/test-data/unit/fine-grained-dataclass-transform.test b/test-data/unit/fine-grained-dataclass-transform.test index cc297bc344aa..89628256fda5 100644 --- a/test-data/unit/fine-grained-dataclass-transform.test +++ b/test-data/unit/fine-grained-dataclass-transform.test @@ -86,9 +86,9 @@ class A(Dataclass): [out] main:7: error: Unexpected keyword argument "x" for "B" -builtins.pyi:13: note: "B" defined here +builtins.pyi:14: note: "B" defined here main:7: error: Unexpected keyword argument "y" for "B" -builtins.pyi:13: note: "B" 
defined here +builtins.pyi:14: note: "B" defined here == [case frozenInheritanceViaDefault] diff --git a/test-data/unit/fixtures/dataclasses.pyi b/test-data/unit/fixtures/dataclasses.pyi index 059c853a621f..29f87ae97e62 100644 --- a/test-data/unit/fixtures/dataclasses.pyi +++ b/test-data/unit/fixtures/dataclasses.pyi @@ -3,6 +3,7 @@ from typing import ( Generic, Iterator, Iterable, Mapping, Optional, Sequence, Tuple, TypeVar, Union, overload, ) +from typing_extensions import override _T = TypeVar('_T') _U = TypeVar('_U') @@ -29,8 +30,10 @@ class dict(Mapping[KT, VT]): def __init__(self, **kwargs: VT) -> None: pass @overload def __init__(self, arg: Iterable[Tuple[KT, VT]], **kwargs: VT) -> None: pass + @override def __getitem__(self, key: KT) -> VT: pass def __setitem__(self, k: KT, v: VT) -> None: pass + @override def __iter__(self) -> Iterator[KT]: pass def __contains__(self, item: object) -> int: pass def update(self, a: Mapping[KT, VT]) -> None: pass @@ -42,7 +45,9 @@ class dict(Mapping[KT, VT]): class list(Generic[_T], Sequence[_T]): def __contains__(self, item: object) -> int: pass + @override def __getitem__(self, key: int) -> _T: pass + @override def __iter__(self) -> Iterator[_T]: pass class function: pass diff --git a/test-data/unit/fixtures/dict-full.pyi b/test-data/unit/fixtures/dict-full.pyi new file mode 100644 index 000000000000..f20369ce9332 --- /dev/null +++ b/test-data/unit/fixtures/dict-full.pyi @@ -0,0 +1,83 @@ +# Builtins stub used in dictionary-related test cases (more complete). + +from _typeshed import SupportsKeysAndGetItem +import _typeshed +from typing import ( + TypeVar, Generic, Iterable, Iterator, Mapping, Tuple, overload, Optional, Union, Sequence, + Self, +) + +T = TypeVar('T') +T2 = TypeVar('T2') +KT = TypeVar('KT') +VT = TypeVar('VT') + +class object: + def __init__(self) -> None: pass + def __init_subclass__(cls) -> None: pass + def __eq__(self, other: object) -> bool: pass + +class type: + __annotations__: Mapping[str, object] + +class dict(Mapping[KT, VT]): + @overload + def __init__(self, **kwargs: VT) -> None: pass + @overload + def __init__(self, arg: Iterable[Tuple[KT, VT]], **kwargs: VT) -> None: pass + def __getitem__(self, key: KT) -> VT: pass + def __setitem__(self, k: KT, v: VT) -> None: pass + def __iter__(self) -> Iterator[KT]: pass + def __contains__(self, item: object) -> int: pass + def update(self, a: SupportsKeysAndGetItem[KT, VT]) -> None: pass + @overload + def get(self, k: KT) -> Optional[VT]: pass + @overload + def get(self, k: KT, default: Union[VT, T]) -> Union[VT, T]: pass + def __len__(self) -> int: ... + + # This was actually added in 3.9: + @overload + def __or__(self, __value: dict[KT, VT]) -> dict[KT, VT]: ... + @overload + def __or__(self, __value: dict[T, T2]) -> dict[Union[KT, T], Union[VT, T2]]: ... + @overload + def __ror__(self, __value: dict[KT, VT]) -> dict[KT, VT]: ... + @overload + def __ror__(self, __value: dict[T, T2]) -> dict[Union[KT, T], Union[VT, T2]]: ... + # dict.__ior__ should be kept roughly in line with MutableMapping.update() + @overload # type: ignore[misc] + def __ior__(self, __value: _typeshed.SupportsKeysAndGetItem[KT, VT]) -> Self: ... + @overload + def __ior__(self, __value: Iterable[Tuple[KT, VT]]) -> Self: ... 
+ +class int: # for convenience + def __add__(self, x: Union[int, complex]) -> int: pass + def __radd__(self, x: int) -> int: pass + def __sub__(self, x: Union[int, complex]) -> int: pass + def __neg__(self) -> int: pass + real: int + imag: int + +class str: pass # for keyword argument key type +class bytes: pass + +class list(Sequence[T]): # needed by some test cases + def __getitem__(self, x: int) -> T: pass + def __iter__(self) -> Iterator[T]: pass + def __mul__(self, x: int) -> list[T]: pass + def __contains__(self, item: object) -> bool: pass + def append(self, item: T) -> None: pass + +class tuple(Generic[T]): pass +class function: pass +class float: pass +class complex: pass +class bool(int): pass + +class ellipsis: + __class__: object +def isinstance(x: object, t: Union[type, Tuple[type, ...]]) -> bool: pass +class BaseException: pass + +def iter(__iterable: Iterable[T]) -> Iterator[T]: pass diff --git a/test-data/unit/fixtures/dict.pyi b/test-data/unit/fixtures/dict.pyi index 7c0c8767f7d7..ed2287511161 100644 --- a/test-data/unit/fixtures/dict.pyi +++ b/test-data/unit/fixtures/dict.pyi @@ -1,4 +1,7 @@ -# Builtins stub used in dictionary-related test cases. +# Builtins stub used in dictionary-related test cases (stripped down). +# +# NOTE: Use dict-full.pyi if you need more builtins instead of adding here, +# if feasible. from _typeshed import SupportsKeysAndGetItem import _typeshed @@ -14,11 +17,9 @@ VT = TypeVar('VT') class object: def __init__(self) -> None: pass - def __init_subclass__(cls) -> None: pass def __eq__(self, other: object) -> bool: pass -class type: - __annotations__: Mapping[str, object] +class type: pass class dict(Mapping[KT, VT]): @overload @@ -36,28 +37,10 @@ class dict(Mapping[KT, VT]): def get(self, k: KT, default: Union[VT, T]) -> Union[VT, T]: pass def __len__(self) -> int: ... - # This was actually added in 3.9: - @overload - def __or__(self, __value: dict[KT, VT]) -> dict[KT, VT]: ... - @overload - def __or__(self, __value: dict[T, T2]) -> dict[Union[KT, T], Union[VT, T2]]: ... - @overload - def __ror__(self, __value: dict[KT, VT]) -> dict[KT, VT]: ... - @overload - def __ror__(self, __value: dict[T, T2]) -> dict[Union[KT, T], Union[VT, T2]]: ... - # dict.__ior__ should be kept roughly in line with MutableMapping.update() - @overload # type: ignore[misc] - def __ior__(self, __value: _typeshed.SupportsKeysAndGetItem[KT, VT]) -> Self: ... - @overload - def __ior__(self, __value: Iterable[Tuple[KT, VT]]) -> Self: ... 
-
 class int: # for convenience
     def __add__(self, x: Union[int, complex]) -> int: pass
     def __radd__(self, x: int) -> int: pass
     def __sub__(self, x: Union[int, complex]) -> int: pass
-    def __neg__(self) -> int: pass
-    real: int
-    imag: int
 
 class str: pass # for keyword argument key type
 class bytes: pass
@@ -74,10 +57,8 @@ class function: pass
 class float: pass
 class complex: pass
 class bool(int): pass
-
-class ellipsis:
-    __class__: object
-def isinstance(x: object, t: Union[type, Tuple[type, ...]]) -> bool: pass
+class ellipsis: pass
 class BaseException: pass
+def isinstance(x: object, t: Union[type, Tuple[type, ...]]) -> bool: pass
 
 def iter(__iterable: Iterable[T]) -> Iterator[T]: pass
diff --git a/test-data/unit/fixtures/len.pyi b/test-data/unit/fixtures/len.pyi
index c72596661858..ee39d952701f 100644
--- a/test-data/unit/fixtures/len.pyi
+++ b/test-data/unit/fixtures/len.pyi
@@ -10,7 +10,7 @@ class object:
 class type:
     def __init__(self, x) -> None: pass
 
-class tuple(Generic[T]):
+class tuple(Sequence[T]):
     def __len__(self) -> int: pass
 
 class list(Sequence[T]): pass
diff --git a/test-data/unit/fixtures/list.pyi b/test-data/unit/fixtures/list.pyi
index 90fbabe8bc92..3dcdf18b2faa 100644
--- a/test-data/unit/fixtures/list.pyi
+++ b/test-data/unit/fixtures/list.pyi
@@ -6,6 +6,7 @@ T = TypeVar('T')
 
 class object:
     def __init__(self) -> None: pass
+    def __eq__(self, other: object) -> bool: pass
 
 class type: pass
 class ellipsis: pass
diff --git a/test-data/unit/fixtures/tuple.pyi b/test-data/unit/fixtures/tuple.pyi
index e270f3d79d3e..cb6347e9f2fd 100644
--- a/test-data/unit/fixtures/tuple.pyi
+++ b/test-data/unit/fixtures/tuple.pyi
@@ -49,8 +49,6 @@ class list(Sequence[T], Generic[T]):
 
 def isinstance(x: object, t: type) -> bool: pass
 
-def sum(iterable: Iterable[T], start: Optional[T] = None) -> T: pass
-
 class BaseException: pass
 
 class dict: pass
diff --git a/test-data/unit/fixtures/typing-full.pyi b/test-data/unit/fixtures/typing-full.pyi
index ef903ace78af..ca8a2413f05f 100644
--- a/test-data/unit/fixtures/typing-full.pyi
+++ b/test-data/unit/fixtures/typing-full.pyi
@@ -196,3 +196,6 @@ def override(__arg: T) -> T: ...
 
 # Was added in 3.11
 def reveal_type(__obj: T) -> T: ...
+
+# Only exists in type checking time:
+def type_check_only(__func_or_class: T) -> T: ...
diff --git a/test-data/unit/lib-stub/typing_extensions.pyi b/test-data/unit/lib-stub/typing_extensions.pyi
index 216005e3cf83..7aca6fad1b42 100644
--- a/test-data/unit/lib-stub/typing_extensions.pyi
+++ b/test-data/unit/lib-stub/typing_extensions.pyi
@@ -1,5 +1,5 @@
 import typing
-from typing import Any, Mapping, Iterable, Iterator, NoReturn as NoReturn, Dict, Tuple, Type
+from typing import Any, Callable, Mapping, Iterable, Iterator, NoReturn as NoReturn, Dict, Tuple, Type
 from typing import TYPE_CHECKING as TYPE_CHECKING
 from typing import NewType as NewType, overload as overload
 
@@ -59,6 +59,8 @@ class _TypedDict(Mapping[str, object]):
     # Stubtest's tests need the following items:
     __required_keys__: frozenset[str]
     __optional_keys__: frozenset[str]
+    __readonly_keys__: frozenset[str]
+    __mutable_keys__: frozenset[str]
     __total__: bool
 
 def TypedDict(typename: str, fields: Dict[str, Type[_T]], *, total: Any = ...) -> Type[dict]: ...
@@ -75,5 +77,6 @@ def dataclass_transform(
 ) -> Callable[[T], T]: ...
 
 def override(__arg: _T) -> _T: ...
+def deprecated(__msg: str) -> Callable[[_T], _T]: ...
 
 _FutureFeatureFixture = 0
diff --git a/test-data/unit/pythoneval.test b/test-data/unit/pythoneval.test
index 7dd2b2f76f8c..c6ca71f5d56a 100644
--- a/test-data/unit/pythoneval.test
+++ b/test-data/unit/pythoneval.test
@@ -1568,24 +1568,24 @@ note: A user-defined top-level module with name "typing" is not supported
 # flags: --ignore-missing-imports
 import scribe # No Python 3 stubs available for scribe
 from scribe import x
-import docutils # Python 3 stubs available for docutils
+import python2 # Python 3 stubs available for python2
 import foobar_asdf
 import jack # This has a stubs package but was never bundled with mypy, so ignoring works
 [out]
-_testIgnoreImportIfNoPython3StubAvailable.py:4: error: Library stubs not installed for "docutils"
-_testIgnoreImportIfNoPython3StubAvailable.py:4: note: Hint: "python3 -m pip install types-docutils"
+_testIgnoreImportIfNoPython3StubAvailable.py:4: error: Library stubs not installed for "python2"
+_testIgnoreImportIfNoPython3StubAvailable.py:4: note: Hint: "python3 -m pip install types-six"
 _testIgnoreImportIfNoPython3StubAvailable.py:4: note: (or run "mypy --install-types" to install all missing stub packages)
 _testIgnoreImportIfNoPython3StubAvailable.py:4: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
 
 [case testNoPython3StubAvailable]
 import scribe
 from scribe import x
-import docutils
+import python2
 [out]
 _testNoPython3StubAvailable.py:1: error: Cannot find implementation or library stub for module named "scribe"
 _testNoPython3StubAvailable.py:1: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
-_testNoPython3StubAvailable.py:3: error: Library stubs not installed for "docutils"
-_testNoPython3StubAvailable.py:3: note: Hint: "python3 -m pip install types-docutils"
+_testNoPython3StubAvailable.py:3: error: Library stubs not installed for "python2"
+_testNoPython3StubAvailable.py:3: note: Hint: "python3 -m pip install types-six"
 _testNoPython3StubAvailable.py:3: note: (or run "mypy --install-types" to install all missing stub packages)
diff --git a/test-data/unit/stubgen.test b/test-data/unit/stubgen.test
index 895500c1ba57..cd38242ce031 100644
--- a/test-data/unit/stubgen.test
+++ b/test-data/unit/stubgen.test
@@ -27,20 +27,20 @@ def g(arg) -> None: ...
 def f(a, b=2): ...
 def g(b=-1, c=0): ...
 [out]
-def f(a, b: int = ...) -> None: ...
-def g(b: int = ..., c: int = ...) -> None: ...
+def f(a, b: int = 2) -> None: ...
+def g(b: int = -1, c: int = 0) -> None: ...
 
 [case testDefaultArgNone]
 def f(x=None): ...
 [out]
 from _typeshed import Incomplete
 
-def f(x: Incomplete | None = ...) -> None: ...
+def f(x: Incomplete | None = None) -> None: ...
 
 [case testDefaultArgBool]
 def f(x=True, y=False): ...
 [out]
-def f(x: bool = ..., y: bool = ...) -> None: ...
+def f(x: bool = True, y: bool = False) -> None: ...
 
 [case testDefaultArgBool_inspect]
 def f(x=True, y=False): ...
@@ -48,9 +48,9 @@ def f(x=True, y=False): ...
 def f(x: bool = ..., y: bool = ...): ...
 
 [case testDefaultArgStr]
-def f(x='foo'): ...
+def f(x='foo',y="how's quotes"): ...
 [out]
-def f(x: str = ...) -> None: ...
+def f(x: str = 'foo', y: str = "how's quotes") -> None: ...
 
 [case testDefaultArgStr_inspect]
 def f(x='foo'): ...
@@ -58,14 +58,16 @@ def f(x='foo'): ...
 [out]
 def f(x: str = ...): ...
 
 [case testDefaultArgBytes]
-def f(x=b'foo'): ...
+def f(x=b'foo',y=b"what's up",z=b'\xc3\xa0 la une'): ...
 [out]
-def f(x: bytes = ...) -> None: ...
+def f(x: bytes = b'foo', y: bytes = b"what's up", z: bytes = b'\xc3\xa0 la une') -> None: ...
 
 [case testDefaultArgFloat]
-def f(x=1.2): ...
+def f(x=1.2,y=1e-6,z=0.0,w=-0.0,v=+1.0): ...
+def g(x=float("nan"), y=float("inf"), z=float("-inf")): ...
 [out]
-def f(x: float = ...) -> None: ...
+def f(x: float = 1.2, y: float = 1e-06, z: float = 0.0, w: float = -0.0, v: float = +1.0) -> None: ...
+def g(x=..., y=..., z=...) -> None: ...
 
 [case testDefaultArgOther]
 def f(x=ord): ...
@@ -126,10 +128,10 @@ def i(a, *, b=1): ...
 def j(a, *, b=1, **c): ...
 [out]
 def f(a, *b, **c) -> None: ...
-def g(a, *b, c: int = ...) -> None: ...
-def h(a, *b, c: int = ..., **d) -> None: ...
-def i(a, *, b: int = ...) -> None: ...
-def j(a, *, b: int = ..., **c) -> None: ...
+def g(a, *b, c: int = 1) -> None: ...
+def h(a, *b, c: int = 1, **d) -> None: ...
+def i(a, *, b: int = 1) -> None: ...
+def j(a, *, b: int = 1, **c) -> None: ...
 
 [case testClass]
 class A:
@@ -356,8 +358,8 @@ y: Incomplete
 def f(x, *, y=1): ...
 def g(x, *, y=1, z=2): ...
 [out]
-def f(x, *, y: int = ...) -> None: ...
-def g(x, *, y: int = ..., z: int = ...) -> None: ...
+def f(x, *, y: int = 1) -> None: ...
+def g(x, *, y: int = 1, z: int = 2) -> None: ...
 
 [case testProperty]
 class A:
@@ -587,6 +589,8 @@ __all__ = [] + ['f']
 def f(): ...
 def g(): ...
 [out]
+__all__ = ['f']
+
 def f() -> None: ...
 
 [case testOmitDefsNotInAll_semanal]
@@ -594,6 +598,8 @@ __all__ = ['f']
 def f(): ...
 def g(): ...
 [out]
+__all__ = ['f']
+
 def f() -> None: ...
 
 [case testOmitDefsNotInAll_inspect]
@@ -601,6 +607,8 @@ __all__ = [] + ['f']
 def f(): ...
 def g(): ...
 [out]
+__all__ = ['f']
+
 def f(): ...
 
 [case testVarDefsNotInAll_import]
@@ -610,6 +618,8 @@ x = 1
 y = 1
 def g(): ...
 [out]
+__all__ = ['f', 'g']
+
 def f() -> None: ...
 def g() -> None: ...
 
@@ -620,6 +630,8 @@ x = 1
 y = 1
 def g(): ...
 [out]
+__all__ = ['f', 'g']
+
 def f(): ...
 def g(): ...
 
@@ -628,6 +640,8 @@ __all__ = [] + ['f']
 def f(): ...
 class A: ...
 [out]
+__all__ = ['f']
+
 def f() -> None: ...
 
 class A: ...
@@ -637,6 +651,8 @@ __all__ = [] + ['f']
 def f(): ...
 class A: ...
 [out]
+__all__ = ['f']
+
 def f(): ...
 
 class A: ...
@@ -647,6 +663,8 @@ class A:
     x = 1
     def f(self): ...
 [out]
+__all__ = ['A']
+
 class A:
     x: int
     def f(self) -> None: ...
@@ -684,6 +702,8 @@ x = 1
 [out]
 from re import match as match, sub as sub
 
+__all__ = ['match', 'sub', 'x']
+
 x: int
 
 [case testExportModule_import]
@@ -694,6 +714,8 @@ y = 2
 [out]
 import re as re
 
+__all__ = ['re', 'x']
+
 x: int
 
 [case testExportModule2_import]
@@ -704,6 +726,8 @@ y = 2
 [out]
 import re as re
 
+__all__ = ['re', 'x']
+
 x: int
 
 [case testExportModuleAs_import]
@@ -714,6 +738,8 @@ y = 2
 [out]
 import re as rex
 
+__all__ = ['rex', 'x']
+
 x: int
 
 [case testExportModuleInPackage_import]
@@ -722,6 +748,8 @@ __all__ = ['p']
 [out]
 import urllib.parse as p
 
+__all__ = ['p']
+
 [case testExportPackageOfAModule_import]
 import urllib.parse
 __all__ = ['urllib']
@@ -729,6 +757,8 @@ __all__ = ['urllib']
 [out]
 import urllib as urllib
 
+__all__ = ['urllib']
+
 [case testRelativeImportAll]
 from .x import *
 [out]
@@ -741,6 +771,8 @@ x = 1
 class C:
     def g(self): ...
 [out]
+__all__ = ['f', 'x', 'C', 'g']
+
 def f() -> None: ...
 
 x: int
@@ -758,6 +790,8 @@ x = 1
 class C:
     def g(self): ...
 [out]
+__all__ = ['f', 'x', 'C', 'g']
+
 def f(): ...
 
 x: int
@@ -1253,8 +1287,8 @@ from _typeshed import Incomplete
 
 class A:
     x: Incomplete
-    def __init__(self, a: Incomplete | None = ...) -> None: ...
-    def method(self, a: Incomplete | None = ...) -> None: ...
+    def __init__(self, a: Incomplete | None = None) -> None: ...
+    def method(self, a: Incomplete | None = None) -> None: ...
 
 [case testAnnotationImportsFrom]
 import foo
@@ -2343,6 +2377,8 @@ else:
 [out]
 import cookielib as cookielib
 
+__all__ = ['cookielib']
+
 [case testCannotCalculateMRO_semanal]
 class X: pass
@@ -2480,7 +2516,7 @@ from _typeshed import Incomplete as _Incomplete
 
 Y: _Incomplete
 
-def g(x: _Incomplete | None = ...) -> None: ...
+def g(x: _Incomplete | None = None) -> None: ...
 
 x: _Incomplete
@@ -2788,6 +2824,8 @@ class A: pass
 # p/__init__.pyi
 from p.a import A
 
+__all__ = ['a']
+
 a: A
 # p/a.pyi
 class A: ...
@@ -2961,7 +2999,9 @@ __uri__ = ''
 __version__ = ''
 [out]
-from m import __version__ as __version__
+from m import __about__ as __about__, __author__ as __author__, __version__ as __version__
+
+__all__ = ['__about__', '__author__', '__version__']
 
 [case testAttrsClass_semanal]
 import attrs
@@ -3465,7 +3505,7 @@ class P(Protocol):
 [case testNonDefaultKeywordOnlyArgAfterAsterisk]
 def func(*, non_default_kwarg: bool, default_kwarg: bool = True): ...
 [out]
-def func(*, non_default_kwarg: bool, default_kwarg: bool = ...): ...
+def func(*, non_default_kwarg: bool, default_kwarg: bool = True): ...
 
 [case testNestedGenerator]
 def f1():
@@ -3871,6 +3911,53 @@ def gen2() -> _Generator[_Incomplete, _Incomplete, _Incomplete]: ...
 class X(_Incomplete): ...
 class Y(_Incomplete): ...
 
+[case testIgnoreLongDefaults]
+def f(x='abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz\
+abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz\
+abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz\
+abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz'): ...
+
+def g(x=b'abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz\
+abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz\
+abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz\
+abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz'): ...
+
+def h(x=123456789012345678901234567890123456789012345678901234567890\
+123456789012345678901234567890123456789012345678901234567890\
+123456789012345678901234567890123456789012345678901234567890\
+123456789012345678901234567890123456789012345678901234567890): ...
+
+[out]
+def f(x: str = ...) -> None: ...
+def g(x: bytes = ...) -> None: ...
+def h(x: int = ...) -> None: ...
+
+[case testDefaultsOfBuiltinContainers]
+def f(x=(), y=(1,), z=(1, 2)): ...
+def g(x=[], y=[1, 2]): ...
+def h(x={}, y={1: 2, 3: 4}): ...
+def i(x={1, 2, 3}): ...
+def j(x=[(1,"a"), (2,"b")]): ...
+
+[out]
+def f(x=(), y=(1,), z=(1, 2)) -> None: ...
+def g(x=[], y=[1, 2]) -> None: ...
+def h(x={}, y={1: 2, 3: 4}) -> None: ...
+def i(x={1, 2, 3}) -> None: ...
+def j(x=[(1, 'a'), (2, 'b')]) -> None: ...
+
+[case testDefaultsOfBuiltinContainersWithNonTrivialContent]
+def f(x=(1, u.v), y=(k(),), z=(w,)): ...
+def g(x=[1, u.v], y=[k()], z=[w]): ...
+def h(x={1: u.v}, y={k(): 2}, z={m: m}, w={**n}): ...
+def i(x={u.v, 2}, y={3, k()}, z={w}): ...
+
+[out]
+def f(x=..., y=..., z=...) -> None: ...
+def g(x=..., y=..., z=...) -> None: ...
+def h(x=..., y=..., z=..., w=...) -> None: ...
+def i(x=..., y=..., z=...) -> None: ...
+
 [case testDataclass]
 import dataclasses
 import dataclasses as dcs