Python Modules and Packages - An Introduction - Real Python
Table of Contents
Python Modules: Overview
The Module Search Path
The import Statement
import <module_name>
from <module_name> import <name(s)>
from <module_name> import <name> as <alt_name>
import <module_name> as <alt_name>
The dir() Function
Executing a Module as a Script
Reloading a Module
Python Packages
Package Initialization
Importing * From a Package
Subpackages
Conclusion
This article explores Python modules and Python packages, two mechanisms that facilitate modular programming.
Modular programming refers to the process of breaking a large, unwieldy programming task into separate, smaller,
more manageable subtasks or modules. Individual modules can then be cobbled together like building blocks to
create a larger application.
There are several advantages to modularizing code in a large application:
Simplicity: Rather than focusing on the entire problem at hand, a module typically focuses on one relatively
small portion of the problem. If you’re working on a single module, you’ll have a smaller problem domain to
wrap your head around. This makes development easier and less error-prone.
Maintainability: Modules are typically designed so that they enforce logical boundaries between different
problem domains. If modules are written in a way that minimizes interdependency, there is decreased likelihood
that modifications to a single module will have an impact on other parts of the program. (You may even be able
to make changes to a module without having any knowledge of the application outside that module.) This
makes it more viable for a team of many programmers to work collaboratively on a large application.
Reusability: Functionality defined in a single module can be easily reused (through an appropriately defined
interface) by other parts of the application. This eliminates the need to duplicate code.
Scoping: Modules typically define a separate namespace, which helps avoid collisions between identifiers in
different areas of a program. (One of the tenets in the Zen of Python is Namespaces are one honking great idea—
let’s do more of those!)
Functions, modules, and packages are all constructs in Python that promote code modularization.
Python Modules: Overview
There are actually three different ways to define a module in Python:
A module can be written in Python itself.
A module can be written in C and loaded dynamically at run-time, as many modules in the standard library are.
A built-in module is intrinsically contained in the interpreter, like the itertools module.
A module’s contents are accessed the same way in all three cases: with the import statement.
Here, the focus will mostly be on modules that are written in Python. The cool thing about modules written in Python
is that they are exceedingly straightforward to build. All you need to do is create a file that contains legitimate Python
code and then give the file a name with a .py extension. That’s it! No special syntax is necessary.
For example, suppose you have created a file called mod.py containing the following:
mod.py
Python
s = "If Comrade Napoleon says it, it must be right."
a = [100, 200, 300]
def foo(arg):
print(f'arg = {arg}')
class Foo:
pass
Several objects are defined in mod.py:
s (a string)
a (a list)
foo() (a function)
Foo (a class)
Assuming mod.py is in an appropriate location, which you will learn more about shortly, these objects can be accessed by importing the module.
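Here is a sketch of what that looks like interactively (the Foo instance's memory address will differ on your machine):
Python
>>> import mod
>>> mod.s
'If Comrade Napoleon says it, it must be right.'
>>> mod.a
[100, 200, 300]
>>> mod.foo('quux')
arg = quux
>>> x = mod.Foo()
>>> x
<mod.Foo object at 0x03C181F0>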
The Module Search Path
Continuing with the above example, let's take a look at what happens when Python executes the statement:
Python
import mod
When the interpreter executes the above import statement, it searches for mod.py in a list of directories assembled
from the following sources:
The directory from which the input script was run or the current directory if the interpreter is being run
interactively
The list of directories contained in the PYTHONPATH environment variable, if it is set. (The format for PYTHONPATH is
OS-dependent but should mimic the PATH environment variable.)
An installation-dependent list of directories configured at the time Python is installed
The resulting search path is accessible in the Python variable sys.path, which is obtained from a module named sys.
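For example, inspecting sys.path on a Windows setup like the one used in this tutorial might look something like this (the exact entries depend entirely on your installation and environment):
Python
>>> import sys
>>> sys.path
['', 'C:\\Users\\john\\Documents\\Python\\doc', 'C:\\Python36\\Lib\\idlelib',
'C:\\Python36\\python36.zip', 'C:\\Python36\\DLLs', 'C:\\Python36\\lib',
'C:\\Python36', 'C:\\Python36\\lib\\site-packages']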
Thus, to ensure your module is found, you need to do one of the following:
Put mod.py in the directory where the input script is located or the current directory, if interactive
Modify the PYTHONPATH environment variable to contain the directory where mod.py is located before starting the interpreter, or put mod.py in one of the directories already contained in the PYTHONPATH variable
Put mod.py in one of the installation-dependent directories, which you may or may not have write-access to,
depending on the OS
There is actually one additional option: you can put the module file in any directory of your choice and then modify
sys.path at run-time so that it contains that directory. For example, in this case, you could put mod.py in directory
C:\Users\john and then issue the following statements:
Python
>>> sys.path.append(r'C:\Users\john')
>>> sys.path
['', 'C:\\Users\\john\\Documents\\Python\\doc', 'C:\\Python36\\Lib\\idlelib',
'C:\\Python36\\python36.zip', 'C:\\Python36\\DLLs', 'C:\\Python36\\lib',
'C:\\Python36', 'C:\\Python36\\lib\\site-packages', 'C:\\Users\\john']
>>> import mod
Once a module has been imported, you can determine the location where it was found with the module’s __file__
attribute:
Python
>>> import re
>>> re.__file__
'C:\\Python36\\lib\\re.py'
The import Statement
Module contents are made available to the caller with the import statement. The import statement takes many different forms, shown below.
import <module_name>
The simplest form is the one already shown above:
Python
import <module_name>
Note that this does not make the module contents directly accessible to the caller. Each module has its own private
symbol table, which serves as the global symbol table for all objects defined in the module. Thus, a module creates a
separate namespace, as already noted.
The statement import <module_name> only places <module_name> in the caller’s symbol table. The objects that are
defined in the module remain in the module’s private symbol table.
From the caller, objects in the module are only accessible when prefixed with <module_name> via dot notation, as
illustrated below.
After the following import statement, mod is placed into the local symbol table. Thus, mod has meaning in the caller’s
local context:
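(The path shown in the module repr below is just this tutorial's example location; yours will differ.)
Python
>>> import mod
>>> mod
<module 'mod' from 'C:\\Users\\john\\Documents\\Python\\doc\\mod.py'>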
But s and foo remain in the module’s private symbol table and are not meaningful in the local context:
Python
>>> s
NameError: name 's' is not defined
>>> foo('quux')
NameError: name 'foo' is not defined
To be accessed in the local context, names of objects defined in the module must be prefixed by mod:
Python
>>> mod.s
'If Comrade Napoleon says it, it must be right.'
>>> mod.foo('quux')
arg = quux
from <module_name> import <name(s)>
An alternate form of the import statement allows individual objects from the module to be imported directly into the caller's symbol table:
Python
from <module_name> import <name(s)>
Following execution of the above statement, <name(s)> can be referenced in the caller’s environment without the
<module_name> prefix:
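(A sketch, again using mod.py from above.)
Python
>>> from mod import s, foo
>>> s
'If Comrade Napoleon says it, it must be right.'
>>> foo('quux')
arg = quux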
Because this form of import places the object names directly into the caller’s symbol table, any objects that already
exist with the same name will be overwritten:
Python
>>> a = ['foo', 'bar', 'baz']
>>> a
['foo', 'bar', 'baz']
>>> from mod import a
>>> a
[100, 200, 300]
It is even possible to indiscriminately import everything from a module at one fell swoop:
Python
from <module_name> import *
This will place the names of all objects from <module_name> into the local symbol table, with the exception of any that
begin with the underscore (_) character.
For example:
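(Here s, a, foo, and Foo all come from mod.py above; the function's memory address will vary on your machine.)
Python
>>> from mod import *
>>> s
'If Comrade Napoleon says it, it must be right.'
>>> a
[100, 200, 300]
>>> foo
<function foo at 0x03B449C0>
>>> Foo
<class 'mod.Foo'>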
This isn’t necessarily recommended in large-scale production code. It’s a bit dangerous because you are entering
names into the local symbol table en masse. Unless you know them all well and can be confident there won’t be a
conflict, you have a decent chance of overwriting an existing name inadvertently. However, this syntax is quite handy
when you are just mucking around with the interactive interpreter, for testing or discovery purposes, because it
quickly gives you access to everything a module has to offer without a lot of typing.
from <module_name> import <name> as <alt_name>
It is also possible to import individual objects but enter them into the local symbol table with alternate names:
Python
from <module_name> import <name> as <alt_name>
This makes it possible to place names directly into the local symbol table but avoid conflicts with previously existing
names:
Python
>>> s = 'foo'
>>> a = ['foo', 'bar', 'baz']
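>>> # Bind the imported objects to alternate names (string and alist are illustrative)
>>> from mod import s as string, a as alist
>>> s
'foo'
>>> string
'If Comrade Napoleon says it, it must be right.'
>>> a
['foo', 'bar', 'baz']
>>> alist
[100, 200, 300]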
import <module_name> as <alt_name>
You can also import an entire module under an alternate name:
Python
import <module_name> as <alt_name>
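For example, a quick sketch with mod.py:
Python
>>> import mod as my_module
>>> my_module.a
[100, 200, 300]
>>> my_module.foo('qux')
arg = qux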
Module contents can be imported from within a function definition. In that case, the import does not occur until the
function is called:
Python
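>>> # A sketch: the import happens inside the function body, so it runs at call time
>>> def bar():
...     from mod import foo
...     foo('corge')
...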
>>> bar()
arg = corge
However, Python 3 does not allow the indiscriminate import * syntax from within a function:
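(The exact SyntaxError message varies slightly between Python versions.)
Python
>>> def bar():
...     from mod import *
...
SyntaxError: import * only allowed at module level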
Lastly, a try statement with an except ImportError clause can be used to guard against unsuccessful import attempts:
Python
>>> try:
...     # Non-existent module
...     import baz
... except ImportError:
...     print('Module not found')
...
Module not found
Python
>>> try:
...     # Existing module, but non-existent object
...     from mod import baz
... except ImportError:
...     print('Object not found in module')
...
Object not found in module
The dir() Function
The built-in function dir() returns a list of defined names in a namespace. Without arguments, it produces an alphabetically sorted list of names in the current local symbol table:
Python
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
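>>> # Define a few sample names (qux, Bar, and x are just illustrative)
>>> qux = [1, 2, 3, 4, 5]
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'qux']
>>> class Bar:
...     pass
...
>>> x = Bar()
>>> dir()
['Bar', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'qux', 'x']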
Note how the first call to dir() above lists several names that are automatically defined and already in the namespace
when the interpreter starts. As new names are defined (qux, Bar, x), they appear on subsequent invocations of dir().
This can be useful for identifying what exactly has been added to the namespace by an import statement:
Python
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
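>>> # Import the module, then a couple of names from it (illustrative)
>>> import mod
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'mod']
>>> from mod import a, Foo
>>> dir()
['Foo', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'a', 'mod']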
When given an argument that is the name of a module, dir() lists the names defined in the module:
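(A sketch; the exact dunder names in the list can vary by Python version.)
Python
>>> import mod
>>> dir(mod)
['Foo', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__',
'__name__', '__package__', '__spec__', 'a', 'foo', 's']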
Python
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from mod import *
>>> dir()
['Foo', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'a', 'foo', 's']
Executing a Module as a Script
Any .py file that contains a module is essentially also a Python script, and there isn't any reason it can't be executed like one. Consider again mod.py as it was defined above:
mod.py
Python
s = "If Comrade Napoleon says it, it must be right."
a = [100, 200, 300]

def foo(arg):
    print(f'arg = {arg}')

class Foo:
    pass
This can be run as a script:
C:\Users\john\Documents>python mod.py
C:\Users\john\Documents>
There are no errors, so it apparently worked. Granted, it’s not very interesting. As it is written, it only defines objects. It
doesn’t do anything with them, and it doesn’t generate any output.
Let’s modify the above Python module so it does generate some output when run as a script:
mod.py
Python
s = "If Comrade Napoleon says it, it must be right."
a = [100, 200, 300]

def foo(arg):
    print(f'arg = {arg}')

class Foo:
    pass

print(s)
print(a)
foo('quux')
x = Foo()
print(x)
C:\Users\john\Documents>python mod.py
If Comrade Napoleon says it, it must be right.
[100, 200, 300]
arg = quux
<__main__.Foo object at 0x02F101D0>
Unfortunately, now it also generates output when imported as a module:
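(In a fresh interpreter session; the object address will vary.)
Python
>>> import mod
If Comrade Napoleon says it, it must be right.
[100, 200, 300]
arg = quux
<mod.Foo object at 0x0118C7D8>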
This is probably not what you want. It isn’t usual for a module to generate output when it is imported.
Wouldn’t it be nice if you could distinguish between when the file is loaded as a module and when it is run as a
standalone script?
When a .py file is imported as a module, Python sets the special dunder variable __name__ to the name of the module.
However, if a file is run as a standalone script, __name__ is (creatively) set to the string '__main__'. Using this fact, you
can discern which is the case at run-time and alter behavior accordingly:
mod.py
Python
s = "If Comrade Napoleon says it, it must be right."
a = [100, 200, 300]

def foo(arg):
    print(f'arg = {arg}')

class Foo:
    pass

if __name__ == '__main__':
    print('Executing as standalone script')
    print(s)
    print(a)
    foo('quux')
    x = Foo()
    print(x)
Now, when you run the file as a script, you get output:
C:\Users\john\Documents>python mod.py
Executing as standalone script
If Comrade Napoleon says it, it must be right.
[100, 200, 300]
arg = quux
<__main__.Foo object at 0x03450690>
But when you import it as a module, you don't get the extra output.
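In a fresh interpreter session, that looks like this:
Python
>>> import mod
>>> mod.foo('grault')
arg = grault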
Modules are often designed with the capability to run as a standalone script for purposes of testing the functionality
that is contained within the module. This is referred to as unit testing. For example, suppose you have created a
module fact.py containing a factorial function, as follows:
fact.py
Python
def fact(n):
    return 1 if n == 1 else n * fact(n - 1)

if __name__ == '__main__':
    import sys
    if len(sys.argv) > 1:
        print(fact(int(sys.argv[1])))
The file can be treated as a module, and the fact() function imported:
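(A quick sketch, assuming fact.py is on the module search path.)
Python
>>> from fact import fact
>>> fact(6)
720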
But it can also be run as a standalone by passing an integer argument on the command-line for testing:
C:\Users\john\Documents>python fact.py 6
720
Reloading a Module
For reasons of efficiency, a module is only loaded once per interpreter session. That is fine for function and class
definitions, which typically make up the bulk of a module’s contents. But a module can contain executable
statements as well, usually for initialization. Be aware that these statements will only be executed the first time a
module is imported.
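Consider, for illustration, a pared-down mod.py containing just one assignment and one print() call (a sketch consistent with the discussion below):
mod.py
Python
a = [100, 200, 300]
print(a)
In a fresh interpreter session, importing the module repeatedly looks something like this:
Python
>>> import mod
[100, 200, 300]
>>> import mod
>>> import mod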
>>> mod.a
[100, 200, 300]
The print() statement is not executed on subsequent imports. (For that matter, neither is the assignment statement,
but as the final display of the value of mod.a shows, that doesn’t matter. Once the assignment is made, it sticks.)
If you make a change to a module and need to reload it, you need to either restart the interpreter or use a function
called reload() from module importlib:
Python
>>> import mod
[100, 200, 300]
>>> import mod
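>>> # importlib.reload() re-executes the module's top-level code (the path shown will vary)
>>> import importlib
>>> importlib.reload(mod)
[100, 200, 300]
<module 'mod' from 'C:\\Users\\john\\Documents\\Python\\doc\\mod.py'>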
Python Packages
Suppose you have developed a very large application that includes many modules. As the number of modules grows,
it becomes difficult to keep track of them all if they are dumped into one location. This is particularly so if they have
similar names or functionality. You might wish for a means of grouping and organizing them.
Packages allow for a hierarchical structuring of the module namespace using dot notation. In the same way that
modules help avoid collisions between global variable names, packages help avoid collisions between module
names.
Creating a package is quite straightforward, since it makes use of the operating system’s inherent hierarchical file
structure. Consider the following arrangement:
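pkg/
    mod1.py
    mod2.py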
Here, there is a directory named pkg that contains two modules, mod1.py and mod2.py. The contents of the modules
are:
mod1.py
Python
def foo():
    print('[mod1] foo()')

class Foo:
    pass
mod2.py
Python
def bar():
    print('[mod2] bar()')

class Bar:
    pass
Given this structure, if the pkg directory resides in a location where it can be found (in one of the directories contained
in sys.path), you can refer to the two modules with dot notation (pkg.mod1, pkg.mod2) and import them with the
syntax you are already familiar with:
Python
>>> import pkg.mod1, pkg.mod2
>>> pkg.mod1.foo()
[mod1] foo()
>>> x = pkg.mod2.Bar()
>>> x
<pkg.mod2.Bar object at 0x033F7290>
You can import individual modules from the package, or names defined inside them, with the other import forms shown earlier as well.
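For example, a sketch using the pkg package defined above:
Python
>>> from pkg import mod1
>>> mod1.foo()
[mod1] foo()
>>> from pkg.mod2 import bar
>>> bar()
[mod2] bar()
>>> from pkg.mod2 import bar as qux
>>> qux()
[mod2] bar()
You can also technically import the package itself:
Python
>>> import pkg
>>> pkg
<module 'pkg' (namespace)>
(The exact repr depends on whether the package has an __init__.py file; here it doesn't yet.)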
But this is of little avail. Though this is, strictly speaking, a syntactically correct Python statement, it doesn’t do much
of anything useful. In particular, it does not place any of the modules in pkg into the local namespace:
Python
>>> pkg.mod1
Traceback (most recent call last):
File "<pyshell#34>", line 1, in <module>
pkg.mod1
AttributeError: module 'pkg' has no attribute 'mod1'
>>> pkg.mod1.foo()
Traceback (most recent call last):
File "<pyshell#35>", line 1, in <module>
pkg.mod1.foo()
AttributeError: module 'pkg' has no attribute 'mod1'
>>> pkg.mod2.Bar()
Traceback (most recent call last):
File "<pyshell#36>", line 1, in <module>
pkg.mod2.Bar()
AttributeError: module 'pkg' has no attribute 'mod2'
To actually import the modules or their contents, you need to use one of the forms shown above.
Package Initialization
If a file named __init__.py is present in a package directory, it is invoked when the package or a module in the
package is imported. This can be used for execution of package initialization code, such as initialization of package-
level data.
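Consider the following __init__.py file (a sketch: it prints a message and defines a package-level list A that is used in the examples below):
__init__.py
Python
print(f'Invoking __init__.py for {__name__}')
A = ['quux', 'corge', 'grault']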
Let’s add this file to the pkg directory from the above example:
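pkg/
    __init__.py
    mod1.py
    mod2.py
Now when the package is imported, the code in __init__.py runs and the global list A is initialized:
Python
>>> import pkg
Invoking __init__.py for pkg
>>> pkg.A
['quux', 'corge', 'grault']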
A module in the package can access the global variable by importing it in turn:
mod1.py
Python
def foo():
    from pkg import A
    print('[mod1] foo() / A = ', A)

class Foo:
    pass
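Calling foo() then shows the package-level list (a sketch, in a fresh session so that __init__.py runs again):
Python
>>> from pkg import mod1
Invoking __init__.py for pkg
>>> mod1.foo()
[mod1] foo() / A =  ['quux', 'corge', 'grault']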
__init__.py can also be used to effect automatic importing of modules from a package. For example, earlier you saw
that the statement import pkg only places the name pkg in the caller’s local symbol table and doesn’t import any
modules. But if __init__.py in the pkg directory contains the following:
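(A sketch: the same print() call as before, plus explicit imports of the two modules.)
__init__.py
Python
print(f'Invoking __init__.py for {__name__}')
import pkg.mod1, pkg.mod2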
then when you execute import pkg, modules mod1 and mod2 are imported automatically:
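(Again in a fresh interpreter session.)
Python
>>> import pkg
Invoking __init__.py for pkg
>>> pkg.mod1.foo()
[mod1] foo()
>>> pkg.mod2.bar()
[mod2] bar()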
Note: Much of the Python documentation states that an __init__.py file must be present in the package
directory when creating a package. This was once true. It used to be that the very presence of __init__.py
signified to Python that a package was being defined. The file could contain initialization code or even be empty,
but it had to be present.
Starting with Python 3.3, Implicit Namespace Packages were introduced. These allow for the creation of a
package without any __init__.py file. Of course, it can still be present if package initialization is needed. But it is
no longer required. Check out What’s a Python Namespace Package, and What’s It For? to learn more.
Importing * From a Package
For purposes of the following discussion, the previously defined package is expanded so that there are now four modules defined in the pkg directory.
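Here is the expanded layout as a plain directory tree:
pkg/
    mod1.py
    mod2.py
    mod3.py
    mod4.py
Their contents are as shown below: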
mod1.py
Python
def foo():
    print('[mod1] foo()')

class Foo:
    pass
mod2.py
Python
def bar():
    print('[mod2] bar()')

class Bar:
    pass
mod3.py
Python
def baz():
    print('[mod3] baz()')

class Baz:
    pass
mod4.py
Python
def qux():
    print('[mod4] qux()')

class Qux:
    pass
You have already seen that when import * is used for a module, all objects from the module are imported into the
local symbol table, except those whose names begin with an underscore, as always:
Python
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from pkg.mod3 import *
>>> dir()
['Baz', '__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'baz']
>>> baz()
[mod3] baz()
>>> Baz
<class 'pkg.mod3.Baz'>
The analogous statement for a package is this:
Python
from <package_name> import *
What does that do?
Python
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from pkg import *
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
Hmph. Not much. You might have expected (assuming you had any expectations at all) that Python would dive down
into the package directory, find all the modules it could, and import them all. But as you can see, by default that is not
what happens.
Instead, Python follows this convention: if the __init__.py file in the package directory contains a list named __all__,
it is taken to be a list of modules that should be imported when the statement from <package_name> import * is
encountered.
For the present example, suppose you create an __init__.py in the pkg directory like this:
pkg/__init__.py
Python
__all__ = [
'mod1',
'mod2',
'mod3',
'mod4'
]
Now from pkg import * does import the four modules named in __all__:
Python
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
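>>> # With __all__ defined in pkg/__init__.py, the four modules are now imported
>>> from pkg import *
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'mod1', 'mod2', 'mod3', 'mod4']
>>> mod2.bar()
[mod2] bar()
>>> mod4.Qux
<class 'pkg.mod4.Qux'>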
Using import * still isn’t considered terrific form, any more for packages than for modules. But this facility at least
gives the creator of the package some control over what happens when import * is specified. (In fact, it provides the
capability to disallow it entirely, simply by declining to define __all__ at all. As you have seen, the default behavior for
packages is to import nothing.)
By the way, __all__ can be defined in a module as well and serves the same purpose: to control what is imported
with import *. For example, modify mod1.py as follows:
pkg/mod1.py
Python
__all__ = ['foo']

def foo():
    print('[mod1] foo()')

class Foo:
    pass
Now an import * statement from pkg.mod1 will only import what is contained in __all__:
Python
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__']
>>> from pkg.mod1 import *
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__',
'__package__', '__spec__', 'foo']
>>> foo()
[mod1] foo()
>>> Foo
Traceback (most recent call last):
File "<pyshell#37>", line 1, in <module>
Foo
NameError: name 'Foo' is not defined
foo() (the function) is now defined in the local namespace, but Foo (the class) is not, because the latter is not in
__all__.
In summary, __all__ is used by both packages and modules to control what is imported when import * is specified.
But the default behavior differs:
For a package, when __all__ is not defined, import * does not import anything.
For a module, when __all__ is not defined, import * imports everything (except—you guessed it—names
starting with an underscore).
Subpackages
Packages can contain nested subpackages to arbitrary depth. For example, let’s make one more modification to the
example package directory as follows:
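pkg/
    sub_pkg1/
        mod1.py
        mod2.py
    sub_pkg2/
        mod3.py
        mod4.py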
The four modules (mod1.py, mod2.py, mod3.py and mod4.py) are defined as previously. But now, instead of being lumped
together into the pkg directory, they are split out into two subpackage directories, sub_pkg1 and sub_pkg2.
Importing still works the same as shown previously. Syntax is similar, but additional dot notation is used to separate
package name from subpackage name:
Python
>>> import pkg.sub_pkg1.mod1
>>> pkg.sub_pkg1.mod1.foo()
[mod1] foo()
In addition, a module in one subpackage can reference objects in a sibling subpackage (in the event that the sibling
contains some functionality that you need). For example, suppose you want to import and execute function foo()
(defined in module mod1) from within module mod3. You can either use an absolute import:
pkg/sub_pkg2/mod3.py
Python
def baz():
    print('[mod3] baz()')

class Baz:
    pass

from pkg.sub_pkg1.mod1 import foo
foo()
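Importing mod3 then runs the module-level call to foo() (a sketch, in a fresh session):
Python
>>> from pkg.sub_pkg2 import mod3
[mod1] foo()
>>> mod3.baz()
[mod3] baz()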
Or you can use a relative import, where .. refers to the package one level up. From within mod3.py, which is in subpackage sub_pkg2, .. refers to the parent package pkg, and ..sub_pkg1 refers to its sibling subpackage sub_pkg1:
pkg/sub_pkg2/mod3.py
Python
def baz():
    print('[mod3] baz()')

class Baz:
    pass

from ..sub_pkg1.mod1 import foo
foo()
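The result is the same as with the absolute import (again, a sketch in a fresh session):
Python
>>> from pkg.sub_pkg2 import mod3
[mod1] foo()
>>> mod3.baz()
[mod3] baz()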
Conclusion
In this tutorial, you covered how to create a Python module, where the interpreter searches for modules, the different forms of the import statement, how to run a module as a standalone script, and how to organize modules into packages and subpackages, including package initialization.
This will hopefully allow you to better understand how to gain access to the functionality available in the many third-party and built-in modules available in Python.
Additionally, if you are developing your own application, creating your own modules and packages will help you
organize and modularize your code, which makes coding, maintenance, and debugging easier.
If you want to learn more, check out the Python.org documentation on modules and the import system.
Happy Pythoning!