
Tensorflow keras layer #9707


Merged: 10 commits, Mar 9, 2023
44 changes: 44 additions & 0 deletions stubs/tensorflow/@tests/stubtest_allowlist.txt
@@ -15,8 +15,52 @@ tensorflow.DType.__getattr__
tensorflow.Graph.__getattr__
tensorflow.Operation.__getattr__
tensorflow.Variable.__getattr__
tensorflow.keras.layers.Layer.__getattr__
# Internal undocumented API
tensorflow.RaggedTensor.__init__
# Has an undocumented extra argument that tf.Variable, which acts like a subclass
# (by dynamically patching tf.Tensor methods), does not preserve.
tensorflow.Tensor.__getitem__
# stub internal utility
tensorflow._aliases

# Tensorflow imports are cursed.
# import tensorflow.initializers
# import tensorflow as tf
# tf.initializers
# Usually these two forms refer to the same module, but for tensorflow the first
# often does not work while the second does. The documentation describes
# tf.initializers as a module, and it has that type when accessed the second way,
# but the real module file has a completely different name (even a different
# package) and is resolved dynamically.
# tf.initializers at runtime is <module 'keras.api._v2.keras.initializers' from '...'>
tensorflow.initializers
Contributor Author:

This is probably the most cursed import weirdness (pylint is unhappy with it too). A lot of other tf modules do similar things, and this part of the allowlist will grow over time.
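The dynamic aliasing described above is typically implemented with a module-level `__getattr__` (PEP 562). A minimal runnable sketch of the general pattern, using a hypothetical `fakepkg` package (not tensorflow's actual internals) and the stdlib `json` module as a stand-in target:

```python
import importlib
import sys
import types

# Build a package whose "initializers" attribute resolves lazily via a
# PEP 562 module-level __getattr__, mimicking how an attribute like
# tf.initializers can resolve to a module living under a different real name.
pkg = types.ModuleType("fakepkg")

def _pkg_getattr(name: str):
    if name == "initializers":
        # Redirect to a module with a completely different real name
        # (stand-in for keras.api._v2.keras.initializers).
        return importlib.import_module("json")
    raise AttributeError(name)

pkg.__getattr__ = _pkg_getattr
sys.modules["fakepkg"] = pkg

import fakepkg

# Attribute access works and yields the redirected module...
print(fakepkg.initializers.__name__)  # json
# ...but `import fakepkg.initializers` would fail with ModuleNotFoundError,
# because no such submodule actually exists on disk.
```

This is why stubtest and type checkers struggle: the attribute exists at runtime but there is no corresponding importable submodule.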


# Layer constructors always have **kwargs but only allow a few specific keys. PEP 692
Contributor Author:

I don't think we have a PEP 692 tracker ticket yet. I'll make one later if no one covers it first.

Member:

I think PEP 692 might already have sufficient support to use it; at least mypy and pyright support it. I haven't checked pytype, though.

Contributor Author:

Testing basic 692 support:

from typing import TypedDict

from typing_extensions import Unpack


class Foo(TypedDict, total=False):
    a: int
    b: str


def f(**kwargs: Unpack[Foo]) -> None:
    ...


f(a=1, b="2")  # OK
f(a=1, b=2)  # Error: b has type str
f(a=1, b="2", c=3)  # Error: unexpected keyword argument "c"
  • pyright works fine
  • mypy works if we add --enable-incomplete-feature=Unpack (otherwise it gives error: "Unpack" support is experimental, use --enable-incomplete-feature=Unpack to enable)
  • pytype produces typing_extensions.Unpack not supported yet [not-supported-yet]. No errors otherwise.
  • pyre, like pytype, doesn't support it and just produces pep_692.py:11:16 Undefined or invalid type [11]: Annotation Unpack is not defined as a type.

I'll open a ticket to track this.

# would allow us to specify this with **kwargs and remove the need for these exceptions.
tensorflow.keras.layers.*.__init__

# __call__ in tensorflow classes often allows keyword usage, but subclasses are
# not expected to handle the keyword case. As an example,
# class MyLayer(tf.keras.layers.Layer):
#     def call(self, x):
#         ...
# is common even though Layer.call is defined as def call(self, inputs). Treating inputs as
# a keyword argument would lead to many false positives with typical subclass usage.
# An additional awkwardness for Layers is that call may optionally take training/mask as keyword
# arguments, and some layers do while others do not. At runtime call is not intended to be used
# directly by users, but through __call__, which extracts the training/mask arguments. Trying to
# describe this better in stubs would similarly add many false-positive Liskov violations.
tensorflow.keras.layers.*.call
tensorflow.keras.regularizers.Regularizer.__call__
tensorflow.keras.constraints.Constraint.__call__

# The Layer class does a good deal of __new__ magic and actually returns one of two different
# internal types depending on tensorflow execution mode. This looks like an implementation detail.
tensorflow.keras.layers.Layer.__new__

# build/compute_output_shape are marked positional-only in the stubs
# because the argument name is inconsistent across layers and looks like
# an implementation detail; the documentation never mentions the
# disagreements.
tensorflow.keras.layers.*.build
tensorflow.keras.layers.*.compute_output_shape
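The rationale above is the standard one for marking a base-class parameter positional-only: subclasses may then freely rename it without a Liskov violation, because no caller can depend on the name. A minimal pure-Python sketch (class and method names are illustrative, not the real Layer API):

```python
class Base:
    # The `/` marker makes `inputs` positional-only: callers cannot write
    # base.call(inputs=...), so subclasses are free to rename the parameter.
    def call(self, inputs, /):
        return inputs


class Child(Base):
    # Renaming `inputs` to `x` is safe and passes type checking,
    # since the keyword form was never part of the contract.
    def call(self, x, /):
        return x * 2


print(Child().call(21))  # 42
```

In stub files the same effect is often written with the older double-underscore convention (e.g. `def __call__(self, __w: Tensor)` in the Constraint stub), which type checkers treat as positional-only.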
33 changes: 31 additions & 2 deletions stubs/tensorflow/tensorflow/__init__.pyi
@@ -4,10 +4,12 @@ from builtins import bool as _bool
from collections.abc import Callable, Iterable, Iterator, Sequence
from contextlib import contextmanager
from enum import Enum
from typing import Any, NoReturn, overload
from typing_extensions import Self, TypeAlias
from types import TracebackType
from typing import Any, NoReturn, TypeVar, overload
from typing_extensions import ParamSpec, Self, TypeAlias

import numpy
from tensorflow import initializers as initializers, keras as keras, math as math

# Explicit import of DType is covered by the wildcard, but
# is necessary to avoid a crash in pytype.
@@ -253,4 +255,31 @@ class IndexedSlices(metaclass=ABCMeta):
def __neg__(self) -> IndexedSlices: ...
def consumers(self) -> list[Operation]: ...

class name_scope:
    def __init__(self, name: str) -> None: ...
    def __enter__(self) -> str: ...
    def __exit__(self, typ: type[BaseException] | None, value: BaseException | None, traceback: TracebackType | None) -> None: ...

_P = ParamSpec("_P")
_R = TypeVar("_R")

class Module:
    def __init__(self, name: str | None = None) -> None: ...
    @property
    def name(self) -> str: ...
    @property
    def name_scope(self) -> name_scope: ...
    # Documentation only specifies these as returning Sequence. The actual
    # implementation returns a tuple.
    @property
    def variables(self) -> Sequence[Variable]: ...
    @property
    def trainable_variables(self) -> Sequence[Variable]: ...
    @property
    def non_trainable_variables(self) -> Sequence[Variable]: ...
    @property
    def submodules(self) -> Sequence[Module]: ...
    @classmethod
    def with_name_scope(cls, method: Callable[_P, _R]) -> Callable[_P, _R]: ...

def __getattr__(name: str) -> Incomplete: ...
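`with_name_scope` is typed with `ParamSpec` so the decorated method keeps its exact parameter list and return type. A runnable sketch of that typing pattern; the wrapper body here is a stand-in, not tensorflow's implementation:

```python
from typing import Callable, TypeVar

try:
    from typing import ParamSpec  # Python 3.10+
except ImportError:
    from typing_extensions import ParamSpec

_P = ParamSpec("_P")
_R = TypeVar("_R")


def with_name_scope(method: Callable[_P, _R]) -> Callable[_P, _R]:
    # ParamSpec captures the full parameter list, so type checkers still
    # see the original signature on the returned callable.
    def wrapper(*args: _P.args, **kwargs: _P.kwargs) -> _R:
        # A real implementation would enter the module's name scope here.
        return method(*args, **kwargs)

    return wrapper


@with_name_scope
def add(a: int, b: int) -> int:
    return a + b


print(add(2, 3))  # 5
```

With a plain `Callable[..., Any]` annotation the wrapped method would lose all argument checking; `ParamSpec` is what makes the decorator signature-preserving.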
14 changes: 14 additions & 0 deletions stubs/tensorflow/tensorflow/_aliases.pyi
@@ -0,0 +1,14 @@
# Commonly used type aliases.
# Everything in this module is private for stubs. There is no runtime
# equivalent.

from collections.abc import Mapping, Sequence
from typing import Any, TypeVar
from typing_extensions import TypeAlias

import numpy

_T1 = TypeVar("_T1")
ContainerGeneric: TypeAlias = Mapping[str, ContainerGeneric[_T1]] | Sequence[ContainerGeneric[_T1]] | _T1

AnyArray: TypeAlias = numpy.ndarray[Any, Any]
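`ContainerGeneric` is a recursive alias: a value is either a leaf of type `_T1`, or a mapping/sequence of such containers nested to any depth. A small runtime sketch of code consuming that shape; the `flatten` helper is illustrative, not part of the stubs:

```python
from collections.abc import Mapping, Sequence


def flatten(container):
    """Yield every leaf from an arbitrarily nested Mapping/Sequence container."""
    if isinstance(container, Mapping):
        for value in container.values():
            yield from flatten(value)
    # str/bytes are Sequences of themselves; treat them as leaves
    # to avoid infinite recursion.
    elif isinstance(container, Sequence) and not isinstance(container, (str, bytes)):
        for item in container:
            yield from flatten(item)
    else:
        yield container


print(list(flatten({"a": [1, 2], "b": {"c": (3,)}})))  # [1, 2, 3]
```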
1 change: 1 addition & 0 deletions stubs/tensorflow/tensorflow/initializers.pyi
@@ -0,0 +1 @@
from tensorflow.keras.initializers import *
11 changes: 11 additions & 0 deletions stubs/tensorflow/tensorflow/keras/__init__.pyi
@@ -0,0 +1,11 @@
from _typeshed import Incomplete

from tensorflow.keras import (
    activations as activations,
    constraints as constraints,
    initializers as initializers,
    layers as layers,
    regularizers as regularizers,
)

def __getattr__(name: str) -> Incomplete: ...
12 changes: 12 additions & 0 deletions stubs/tensorflow/tensorflow/keras/activations.pyi
@@ -0,0 +1,12 @@
from _typeshed import Incomplete
from collections.abc import Callable
from typing import Any
from typing_extensions import TypeAlias

from tensorflow import Tensor

# The implementation uses isinstance so it must be dict and not any Mapping.
_Activation: TypeAlias = str | None | Callable[[Tensor], Tensor] | dict[str, Any]

def get(identifier: _Activation) -> Callable[[Tensor], Tensor]: ...
def __getattr__(name: str) -> Incomplete: ...
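`get` follows Keras's common identifier-dispatch pattern: a string name, a serialization dict, a callable, or None. A pure-Python sketch of that dispatch (the registry and activation names here are hypothetical), which also shows why the stub requires a concrete `dict` rather than any `Mapping`:

```python
from typing import Any, Callable

# Hypothetical registry mapping names to activation functions.
_REGISTRY: dict[str, Callable[[float], float]] = {
    "relu": lambda x: max(0.0, x),
    "linear": lambda x: x,
}


def get(identifier: Any) -> Callable[[float], float]:
    # The real implementation dispatches with isinstance checks, so a
    # non-dict Mapping would not be recognized as a config dict.
    if callable(identifier):
        return identifier
    if isinstance(identifier, str):
        return _REGISTRY[identifier]
    if isinstance(identifier, dict):
        return _REGISTRY[identifier["class_name"]]
    raise ValueError(f"Could not interpret identifier: {identifier!r}")


print(get("relu")(-3.0))  # 0.0
print(get({"class_name": "linear"})(2.5))  # 2.5
```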
17 changes: 17 additions & 0 deletions stubs/tensorflow/tensorflow/keras/constraints.pyi
@@ -0,0 +1,17 @@
from _typeshed import Incomplete
from collections.abc import Callable
from typing import Any, overload

from tensorflow import Tensor

class Constraint:
    def get_config(self) -> dict[str, Any]: ...
    def __call__(self, __w: Tensor) -> Tensor: ...

@overload
def get(identifier: None) -> None: ...
@overload
def get(identifier: str | dict[str, Any] | Constraint) -> Constraint: ...
@overload
def get(identifier: Callable[[Tensor], Tensor]) -> Callable[[Tensor], Tensor]: ...
def __getattr__(name: str) -> Incomplete: ...
50 changes: 50 additions & 0 deletions stubs/tensorflow/tensorflow/keras/initializers.pyi
@@ -0,0 +1,50 @@
from _typeshed import Incomplete
from collections.abc import Callable
from typing import Any, overload
from typing_extensions import Self, TypeAlias

from tensorflow import Tensor, _DTypeLike, _ShapeLike, _TensorCompatible

class Initializer:
    def __call__(self, shape: _ShapeLike, dtype: _DTypeLike | None = None) -> Tensor: ...
    def get_config(self) -> dict[str, Any]: ...
    @classmethod
    def from_config(cls, config: dict[str, Any]) -> Self: ...

class Constant(Initializer):
    def __init__(self, value: _TensorCompatible = 0) -> None: ...

class GlorotNormal(Initializer):
    def __init__(self, seed: int | None = None) -> None: ...

class GlorotUniform(Initializer):
    def __init__(self, seed: int | None = None) -> None: ...

class TruncatedNormal(Initializer):
    def __init__(self, mean: _TensorCompatible = 0.0, stddev: _TensorCompatible = 0.05, seed: int | None = None) -> None: ...

class RandomNormal(Initializer):
    def __init__(self, mean: _TensorCompatible = 0.0, stddev: _TensorCompatible = 0.05, seed: int | None = None) -> None: ...

class RandomUniform(Initializer):
    def __init__(self, minval: _TensorCompatible = -0.05, maxval: _TensorCompatible = 0.05, seed: int | None = None) -> None: ...

class Zeros(Initializer): ...

constant = Constant
glorot_normal = GlorotNormal
glorot_uniform = GlorotUniform
truncated_normal = TruncatedNormal
zeros = Zeros

_Initializer: TypeAlias = ( # noqa: Y047
str | Initializer | type[Initializer] | Callable[[_ShapeLike], Tensor] | dict[str, Any] | None
)

@overload
def get(identifier: None) -> None: ...
@overload
def get(identifier: str | Initializer | dict[str, Any] | type[Initializer]) -> Initializer: ...
@overload
def get(identifier: Callable[[_ShapeLike], Tensor]) -> Callable[[_ShapeLike], Tensor]: ...
def __getattr__(name: str) -> Incomplete: ...
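The `Self` return type on `from_config` means each subclass's `from_config` is typed as returning that subclass, not the base `Initializer`. A minimal runnable sketch of the pattern, using plain stand-in classes rather than the Keras ones:

```python
from typing import Any

try:
    from typing import Self  # Python 3.11+
except ImportError:
    from typing_extensions import Self


class Initializer:
    def get_config(self) -> dict[str, Any]:
        return {}

    @classmethod
    def from_config(cls, config: dict[str, Any]) -> Self:
        # cls is the concrete subclass, so Constant.from_config(...) is
        # typed (and behaves) as returning Constant, not Initializer.
        return cls(**config)


class Constant(Initializer):
    def __init__(self, value: float = 0.0) -> None:
        self.value = value

    def get_config(self) -> dict[str, Any]:
        return {"value": self.value}


# Config round-trip preserves the concrete type.
restored = Constant.from_config(Constant(3.0).get_config())
print(type(restored).__name__, restored.value)  # Constant 3.0
```

Without `Self`, every subclass would have to override `from_config` purely to narrow the return type.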