Description
Right now it doesn't look like there's a way to specify types for class and object fields. Of course, the analyzer can see method definitions and use them to create class/object types, but what's the right thing to do with a class definition like
class C:
    x = 42
Here there is no annotation to suggest that there would be anything wrong with a statement like C.x = 'foo' or C().x = 'bar', and the programmer might want them to succeed (i.e. C.x has type Any); on the other hand, it seems highly desirable to be able to specify that x is an int in some way if that's the behavior they expect. The # type: comment syntax could work here, but it would be nice to have an alternative for analyzers that don't parse comments.
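For concreteness, a minimal sketch of what the comment-based spelling might look like, assuming the # type: comment form that mypy already accepts on assignments:

class C:
    x = 42  # type: int

C.x = 'foo'    # an analyzer could flag this assignment as a type error
C().x = 'bar'  # and likewise this one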
Additionally, having a distinction between class fields and object fields would be nice. This would enable analyzers to report that, in a program like
class D:
    def __init__(self):
        self.x = 42
the field lookup D().x is well-typed and D.x might not be. It seems like mypy doesn't have this distinction, though I may be missing a way to do so.
Two ways to do this come to mind. First, any assignment to self in __init__ could be treated as specifying an object field (with a # type: comment or a similar annotation indicating that the field has a non-Any static type). Alternatively, the class definition could contain calls, similar to Undefined, that specify its fields, like
class D:
    x = Field(int)
    def __init__(self):
        self.x = 42
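For comparison, a sketch of the first approach, with a # type: comment pinning down the field's static type (the comment is just one way the annotation could be spelled):

class D:
    def __init__(self):
        self.x = 42  # type: int

# Under this scheme an analyzer would infer that instances of D have a field
# x of type int, so D().x is well-typed while D.x may be reported as an error.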
The advantage of the second (Field-based) approach is that it lets programmers explicitly decide what the interface to their objects is, rather than having it inferred.
(We currently use a third approach, specifying class/object fields through decorators, but that doesn't seem to fit with the style proposed in the PEP, and I'm not a big fan of it anymore either after seeing the better stuff you've come up with.) :)
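For reference, a rough sketch of what a decorator-based spelling might look like; the fields decorator below is hypothetical and only illustrates the shape of the idea, not our actual implementation:

def fields(**types):
    # Hypothetical decorator: record the declared field types on the class
    # so that an external analyzer (or a runtime check) can look them up.
    def wrap(cls):
        cls.__field_types__ = types
        return cls
    return wrap

@fields(x=int)
class D:
    def __init__(self):
        self.x = 42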