
[WIP] Support non-type dependencies #9374

Closed · wants to merge 6 commits

7 changes: 4 additions & 3 deletions .github/workflows/daily.yml
@@ -72,13 +72,14 @@ jobs:
shell: bash
run: |
PACKAGES=$(python tests/get_packages.py)
PYTHON_EXECUTABLE="python"

if [ "${{ runner.os }}" = "Linux" ]; then
if [ -n "$PACKAGES" ]; then
sudo apt update && sudo apt install -y $PACKAGES
fi

xvfb-run python tests/stubtest_third_party.py --specified-stubs-only --num-shards 4 --shard-index ${{ matrix.shard-index }}
PYTHON_EXECUTABLE="xvfb-run python"
else
if [ "${{ runner.os }}" = "macOS" ] && [ -n "$PACKAGES" ]; then
NONINTERACTIVE=1 brew install $PACKAGES
@@ -87,10 +88,10 @@ jobs:
if [ "${{ runner.os }}" = "Windows" ] && [ -n "$PACKAGES" ]; then
choco install -y $PACKAGES
fi

python tests/stubtest_third_party.py --specified-stubs-only --num-shards 4 --shard-index ${{ matrix.shard-index }}
fi

$PYTHON_EXECUTABLE tests/stubtest_third_party.py --specified-stubs-only --num-shards 4 --shard-index ${{ matrix.shard-index }}
Collaborator (Author):

@AlexWaygood is this code de-duplication worth making a PR for? (It adds a line, but gives the command to run a single source of truth.)

Member:

Sure, seems like a nice improvement :)

I'd vote for this control flow, though:

if [ "${{ runner.os }}" = "Linux" ]; then
  PYTHON_EXECUTABLE="xvfb-run python"
else
  PYTHON_EXECUTABLE="python"
fi

rather than

PYTHON_EXECUTABLE="python"

if [ "${{ runner.os }}" = "Linux" ]; then
  PYTHON_EXECUTABLE="xvfb-run python"
fi

I find the former much more readable.


stub-uploader:
name: Run the stub_uploader tests
runs-on: ubuntu-latest
7 changes: 4 additions & 3 deletions .github/workflows/stubtest_third_party.yml
@@ -59,14 +59,15 @@ jobs:
if [ -n "$STUBS" ]; then
echo "Testing $STUBS..."
PACKAGES=$(python tests/get_packages.py $STUBS)
PYTHON_EXECUTABLE="python"

if [ "${{ runner.os }}" = "Linux" ]; then
if [ -n "$PACKAGES" ]; then
echo "Installing apt packages: $PACKAGES"
sudo apt update && sudo apt install -y $PACKAGES
fi

xvfb-run python tests/stubtest_third_party.py --specified-stubs-only $STUBS
PYTHON_EXECUTABLE="xvfb-run python"
else
if [ "${{ runner.os }}" = "macOS" ] && [ -n "$PACKAGES" ]; then
echo "Installing Homebrew packages: $PACKAGES"
@@ -77,9 +78,9 @@ jobs:
echo "Installing Chocolatey packages: $PACKAGES"
choco install -y $PACKAGES
fi

python tests/stubtest_third_party.py --specified-stubs-only $STUBS
fi

$PYTHON_EXECUTABLE tests/stubtest_third_party.py --specified-stubs-only $STUBS
else
echo "Nothing to test"
fi
8 changes: 8 additions & 0 deletions .github/workflows/tests.yml
@@ -69,6 +69,7 @@ jobs:
cache: pip
cache-dependency-path: requirements-tests.txt
- run: pip install -r requirements-tests.txt
- run: pip install $(python tests/external_pip_dependencies.py)
- run: ./tests/pytype_test.py --print-stderr

mypy:
@@ -112,6 +113,13 @@
fail-fast: false
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: requirements-tests.txt
- run: pip install tomli
- run: pip install $(python tests/external_pip_dependencies.py)
- name: Get pyright version
uses: SebRollen/toml-action@v1.0.2
id: pyright_version
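A note on the helper referenced in the two new pip install steps above: tests/external_pip_dependencies.py is not included in this diff. As a rough sketch only, under the assumption that it simply collects every requirement from the stubs' METADATA.toml files that is not a types-* stub package, it could look something like this (the PR's actual implementation may differ):

#!/usr/bin/env python3
"""Print the external (non-stub) pip requirements declared in METADATA.toml files."""
from pathlib import Path

import tomli  # the workflow installs tomli; tomllib would also work on Python 3.11+


def external_requirements() -> list[str]:
    requirements: set[str] = set()
    for metadata_file in Path("stubs").glob("*/METADATA.toml"):
        with metadata_file.open("rb") as file:
            metadata = tomli.load(file)
        for requirement in metadata.get("requires", []):
            # "types-*" packages are stub-only dependencies; anything else is a
            # runtime package that has to be installed for the type checkers to see it.
            if not requirement.startswith("types-"):
                requirements.add(requirement)
    return sorted(requirements)


if __name__ == "__main__":
    print(" ".join(external_requirements()))

The workflow then passes the printed names directly to pip install $(python tests/external_pip_dependencies.py).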
6 changes: 5 additions & 1 deletion pyrightconfig.json
@@ -14,7 +14,6 @@
"strictDictionaryInference": true,
"strictSetInference": true,
"reportFunctionMemberAccess": "error",
"reportMissingTypeStubs": "error",
"reportUnusedImport": "error",
"reportUnusedClass": "error",
"reportUnusedFunction": "error",
@@ -38,6 +37,11 @@
"reportUnnecessaryTypeIgnoreComment": "error",
// Leave "type: ignore" comments to mypy
"enableTypeIgnoreComments": false,
// Since we allow non-type dependencies, we may need to refer to a library
// that is not marked as "typed" and has no stubs.
// It's still useful to raise a warning to flag those dependencies.
"useLibraryCodeForTypes": true,
"reportMissingTypeStubs": "warning",
// Stubs are allowed to use private variables
"reportPrivateUsage": "none",
// Stubs don't need the actual modules to be installed
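To make the rationale above concrete, here is a hypothetical stub excerpt (the package and module names are invented, not anything in typeshed) showing the case these two settings are aimed at: the import resolves against the installed package's own code, while pyright still emits a warning that the dependency ships no stubs.

# stubs/example/example.pyi -- hypothetical illustration only
# untyped_dep stands for a runtime (non-types) dependency with no py.typed marker
# and no stubs. With "useLibraryCodeForTypes": true, pyright infers Widget from the
# installed package's source; with "reportMissingTypeStubs": "warning", the import
# is flagged but no longer treated as an error.
from untyped_dep import Widget

def make_widget(name: str) -> Widget: ...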
5 changes: 5 additions & 0 deletions pyrightconfig.stricter.json
@@ -81,6 +81,11 @@
"reportPrivateUsage": "none",
// TODO: Complete incomplete stubs
"reportIncompleteStub": "none",
// Since we allow non-type dependencies, we may need to refer to a library
// that is not marked as "typed" and has no stubs.
// It's still useful to raise a warning to flag those dependencies.
"useLibraryCodeForTypes": true,
"reportMissingTypeStubs": "warning",
// Stubs don't need the actual modules to be installed
"reportMissingModuleSource": "none",
// Incompatible overrides and property type mismatches are out of typeshed's control
2 changes: 1 addition & 1 deletion stubs/D3DShot/METADATA.toml
@@ -1,5 +1,5 @@
version = "0.1.*"
requires = ["types-Pillow"]
requires = ["types-Pillow", "numpy", "torch", "comtypes"]

[tool.stubtest]
# TODO: figure out how to run stubtest for this package
12 changes: 4 additions & 8 deletions stubs/D3DShot/d3dshot/capture_output.pyi
@@ -1,18 +1,14 @@
import enum
from _typeshed import Incomplete
from collections.abc import Sequence
from ctypes import _CVoidConstPLike
from typing_extensions import Literal, TypeAlias

import numpy as np
import numpy.typing as npt
from PIL import Image
from torch import Tensor

_Frame: TypeAlias = Image.Image | Incomplete
# TODO: Complete types once we can import non-types dependencies
# See: #5768
# from torch import Tensor
# from comtypes import IUnknown
# import numpy.typing as npt
# _Frame: TypeAlias = Image.Image | npt.NDArray[np.int32] | npt.NDArray[np.float32] | Tensor
_Frame: TypeAlias = Image.Image | npt.NDArray[np.int32] | npt.NDArray[np.float32] | Tensor

class CaptureOutputs(enum.Enum):
PIL: int
16 changes: 7 additions & 9 deletions stubs/D3DShot/d3dshot/capture_outputs/numpy_capture_output.pyi
@@ -1,17 +1,13 @@
from _typeshed import Incomplete
from collections.abc import Sequence
from ctypes import _CVoidConstPLike
from typing_extensions import Literal, TypeAlias

import numpy as np
import numpy.typing as npt
from d3dshot.capture_output import CaptureOutput
from PIL import Image

# TODO: Complete types once we can import non-types dependencies
# See: #5768
# import numpy as np
# import numpy.typing as npt
# _NDArray: TypeAlias = npt.NDArray[np.int32]
_NDArray: TypeAlias = Incomplete
_NDArray: TypeAlias = npt.NDArray[np.int32]

class NumpyCaptureOutput(CaptureOutput):
def __init__(self) -> None: ...
@@ -25,5 +21,7 @@ class NumpyCaptureOutput(CaptureOutput):
region: tuple[int, int, int, int],
rotation: int,
) -> _NDArray: ...
def to_pil(self, frame: _NDArray) -> Image.Image: ...
def stack(self, frames: Sequence[_NDArray] | _NDArray, stack_dimension: Literal["first", "last"]) -> _NDArray: ...
def to_pil(self, frame: _NDArray) -> Image.Image: ... # type: ignore[override]
def stack( # type: ignore[override]
self, frames: Sequence[_NDArray] | _NDArray, stack_dimension: Literal["first", "last"]
) -> _NDArray: ...
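A note for readers on the new "# type: ignore[override]" comments here and in the other capture outputs: each subclass narrows the frame type accepted by the CaptureOutput base class (typed with the _Frame union) down to its own concrete array or tensor type, and mypy reports that narrowing as a Liskov violation. A minimal self-contained sketch of the same pattern, with illustrative names rather than the real d3dshot API:

from typing import Union

class Base:
    # Accepts any member of a union, mirroring CaptureOutput.to_pil and _Frame.
    def to_pil(self, frame: Union[bytes, int]) -> None: ...

class Narrowed(Base):
    # The parameter is narrower than in Base, so mypy emits an [override] error;
    # the stubs silence it on purpose because each capture output only ever
    # receives the frame type it produces itself.
    def to_pil(self, frame: int) -> None: ...  # type: ignore[override]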
stubs/D3DShot/d3dshot/capture_outputs/numpy_float_capture_output.pyi
@@ -1,5 +1,27 @@
from collections.abc import Sequence
from ctypes import _CVoidConstPLike
from typing_extensions import Literal, TypeAlias

import numpy as np
import numpy.typing as npt
from d3dshot.capture_outputs.numpy_capture_output import NumpyCaptureOutput
from PIL import Image

_NDArray: TypeAlias = npt.NDArray[np.float64]

# TODO: Once we can import non-types dependencies, this CaptureOutput should be float based
# See: #5768
class NumpyFloatCaptureOutput(NumpyCaptureOutput): ...
class NumpyFloatCaptureOutput(NumpyCaptureOutput):
def __init__(self) -> None: ...
def process( # type: ignore[override]
self,
pointer: _CVoidConstPLike,
pitch: float,
size: int,
width: float,
height: float,
region: tuple[float, float, float, float],
rotation: int,
) -> _NDArray: ...
def to_pil(self, frame: _NDArray) -> Image.Image: ... # type: ignore[override]
def stack( # type: ignore[override]
self, frames: Sequence[_NDArray] | _NDArray, stack_dimension: Literal["first", "last"]
) -> _NDArray: ...
4 changes: 2 additions & 2 deletions stubs/D3DShot/d3dshot/capture_outputs/pil_capture_output.pyi
@@ -21,5 +21,5 @@ class PILCaptureOutput(CaptureOutput):
region: tuple[int, int, int, int],
rotation: int,
) -> Image.Image: ...
def to_pil(self, frame: _ImageT) -> _ImageT: ...
def stack(self, frames: Sequence[_ImageT], stack_dimension: _Unused) -> Sequence[_ImageT]: ...
def to_pil(self, frame: _ImageT) -> _ImageT: ... # type: ignore[override]
def stack(self, frames: Sequence[_ImageT], stack_dimension: _Unused) -> Sequence[_ImageT]: ... # type: ignore[override]
15 changes: 5 additions & 10 deletions stubs/D3DShot/d3dshot/capture_outputs/pytorch_capture_output.pyi
@@ -1,15 +1,10 @@
from _typeshed import Incomplete
from collections.abc import Sequence
from ctypes import _CVoidConstPLike
from typing_extensions import Literal, TypeAlias
from typing_extensions import Literal

from d3dshot.capture_output import CaptureOutput
from PIL import Image

# TODO: Complete types once we can import non-types dependencies
# See: https://github.com/python/typeshed/issues/5768
# from torch import Tensor
_Tensor: TypeAlias = Incomplete
from torch import Tensor

class PytorchCaptureOutput(CaptureOutput):
def __init__(self) -> None: ...
@@ -22,6 +17,6 @@ class PytorchCaptureOutput(CaptureOutput):
height: int,
region: tuple[int, int, int, int],
rotation: int,
) -> _Tensor: ...
def to_pil(self, frame: _Tensor) -> Image.Image: ...
def stack(self, frames: Sequence[_Tensor], stack_dimension: Literal["first", "last"]) -> _Tensor: ...
) -> Tensor: ...
def to_pil(self, frame: Tensor) -> Image.Image: ... # type: ignore[override]
def stack(self, frames: Sequence[Tensor], stack_dimension: Literal["first", "last"]) -> Tensor: ... # type: ignore[override]
6 changes: 3 additions & 3 deletions stubs/D3DShot/d3dshot/dll/__init__.pyi
@@ -6,6 +6,7 @@ from ctypes.wintypes import PFLOAT
from typing import TypeVar
from typing_extensions import TypeAlias

from comtypes import IUnknown
from d3dshot.capture_output import _Frame

_ProcessFuncRegionArg = TypeVar("_ProcessFuncRegionArg", tuple[int, int, int, int], None)
@@ -20,9 +21,8 @@ if sys.platform == "win32":
else:
_HRESULT: TypeAlias = Incomplete

# TODO: Use comtypes.IUnknown once we can import non-types dependencies
# See: #5768
class _IUnknown(_CData):
# Type issue in comtypes: Type "IUnknown" cannot be assigned to type "_CData"
class _IUnknown(IUnknown, _CData):
def QueryInterface(self, interface: type, iid: _CData | None = ...) -> _HRESULT: ...
def AddRef(self) -> c_ulong: ...
def Release(self) -> c_ulong: ...
1 change: 1 addition & 0 deletions stubs/JACK-Client/METADATA.toml
@@ -1,4 +1,5 @@
version = "0.5.*"
requires = ["numpy"]

[tool.stubtest]
ignore_missing_stub = false
5 changes: 4 additions & 1 deletion stubs/JACK-Client/jack/__init__.pyi
@@ -4,7 +4,10 @@ from collections.abc import Callable, Generator, Iterable, Iterator, Sequence
from typing import Any, overload
from typing_extensions import Literal, TypeAlias

_NDArray: TypeAlias = Any # FIXME: no typings for numpy arrays
import numpy as np
import numpy.typing as npt

_NDArray: TypeAlias = npt.NDArray[np.generic] # incomplete: np.generic may be too wide

class _JackPositionT: ...

1 change: 1 addition & 0 deletions stubs/pycocotools/METADATA.toml
@@ -1,4 +1,5 @@
version = "2.0.*"
requires = ["numpy"]

[tool.stubtest]
ignore_missing_stub = false
17 changes: 6 additions & 11 deletions stubs/pycocotools/pycocotools/coco.pyi
@@ -1,17 +1,14 @@
from _typeshed import Incomplete
from collections.abc import Collection, Sequence
from pathlib import Path
from typing import Generic, TypeVar, overload
from typing_extensions import Literal, TypeAlias, TypedDict

from . import _EncodedRLE
import numpy as np
import numpy.typing as npt

# TODO: Use numpy types when #5768 is resolved.
# import numpy as np
# import numpy.typing as npt
from . import _EncodedRLE

PYTHON_VERSION: Incomplete
_NDArray: TypeAlias = Incomplete
PYTHON_VERSION: int

class _Image(TypedDict):
id: int
@@ -82,13 +79,11 @@ class COCO:
def showAnns(self, anns: Sequence[_Annotation], draw_bbox: bool = ...) -> None: ...
def loadRes(self, resFile: str) -> COCO: ...
def download(self, tarDir: str | None = ..., imgIds: Collection[int] = ...) -> Literal[-1] | None: ...
def loadNumpyAnnotations(self, data: _NDArray) -> list[_Annotation]: ...
# def loadNumpyAnnotations(self, data: npt.NDArray[np.float64]) -> list[_Annotation]: ...
def loadNumpyAnnotations(self, data: npt.NDArray[np.float64]) -> list[_Annotation]: ...
@overload
def annToRLE(self, ann: _AnnotationG[_RLE]) -> _RLE: ...
@overload
def annToRLE(self, ann: _AnnotationG[_EncodedRLE]) -> _EncodedRLE: ...
@overload
def annToRLE(self, ann: _AnnotationG[_TPolygonSegmentation]) -> _EncodedRLE: ...
def annToMask(self, ann: _Annotation) -> _NDArray: ...
# def annToMask(self, ann: _Annotation) -> npt.NDArray[np.uint8]: ...
def annToMask(self, ann: _Annotation) -> npt.NDArray[np.uint8]: ...
36 changes: 12 additions & 24 deletions stubs/pycocotools/pycocotools/cocoeval.pyi
@@ -1,13 +1,10 @@
from _typeshed import Incomplete
from typing_extensions import Literal, TypeAlias, TypedDict

from .coco import COCO
import numpy as np
import numpy.typing as npt

# TODO: Use numpy types when #5768 is resolved.
# import numpy as np
# import numpy.typing as npt
from .coco import COCO

_NDArray: TypeAlias = Incomplete
_TIOU: TypeAlias = Literal["segm", "bbox", "keypoints"]

class _EvaluationResult(TypedDict):
@@ -17,47 +14,38 @@ class _EvaluationResult(TypedDict):
maxDet: int
dtIds: list[int]
gtIds: list[int]
dtMatches: _NDArray
# dtMatches: npt.NDArray[np.float64]
gtMatches: _NDArray
# gtMatches: npt.NDArray[np.float64]
dtMatches: npt.NDArray[np.float64]
gtMatches: npt.NDArray[np.float64]
dtScores: list[float]
gtIgnore: _NDArray
# gtIgnore: npt.NDArray[np.float64]
dtIgnore: _NDArray
# dtIgnore: npt.NDArray[np.float64]
gtIgnore: npt.NDArray[np.float64]
dtIgnore: npt.NDArray[np.float64]

class COCOeval:
cocoGt: COCO
cocoDt: COCO
evalImgs: list[_EvaluationResult]
eval: _EvaluationResult
params: Params
stats: _NDArray
# stats: npt.NDArray[np.float64]
stats: npt.NDArray[np.float64]
ious: dict[tuple[int, int], list[float]]
def __init__(self, cocoGt: COCO | None = ..., cocoDt: COCO | None = ..., iouType: _TIOU = ...) -> None: ...
def evaluate(self) -> None: ...
def computeIoU(self, imgId: int, catId: int) -> list[float]: ...
def computeOks(self, imgId: int, catId: int) -> _NDArray: ...
# def computeOks(self, imgId: int, catId: int) -> npt.NDArray[np.float64]: ...
def computeOks(self, imgId: int, catId: int) -> npt.NDArray[np.float64]: ...
def evaluateImg(self, imgId: int, catId: int, aRng: list[int], maxDet: int) -> _EvaluationResult: ...
def accumulate(self, p: Params | None = ...) -> None: ...
def summarize(self) -> None: ...

class Params:
imgIds: list[int]
catIds: list[int]
iouThrs: _NDArray
# iouThrs: npt.NDArray[np.float64]
recThrs: _NDArray
# recThrs: npt.NDArray[np.float64]
iouThrs: npt.NDArray[np.float64]
recThrs: npt.NDArray[np.float64]
maxDets: list[int]
areaRng: list[int]
areaRngLbl: list[str]
useCats: int
kpt_oks_sigmas: _NDArray
# kpt_oks_sigmas: npt.NDArray[np.float64]
kpt_oks_sigmas: npt.NDArray[np.float64]
iouType: _TIOU
useSegm: int | None
def __init__(self, iouType: _TIOU = ...) -> None: ...