
Commit 8894a8f

Merge branch '3.0'
2 parents: 2e682b9 + 613052a

File tree

10 files changed: +74 / -73 lines

docs/django/first-steps-with-django.rst

Lines changed: 2 additions & 2 deletions
@@ -43,7 +43,7 @@ alternatives to choose from, see :ref:`celerytut-broker`.
 
 All settings mentioned in the Celery documentation should be added
 to your Django project's ``settings.py`` module. For example
-we can configure the :setting:`BROKER_URL` setting to specify
+you can configure the :setting:`BROKER_URL` setting to specify
 what broker to use::
 
     BROKER_URL = 'amqp://guest:guest@localhost:5672/'

@@ -115,7 +115,7 @@ Calling our task
 ================
 
 Now that the worker is running, open up a new terminal to actually
-call the task we defined::
+call the task you defined::
 
     >>> from celerytest.tasks import add
 
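For context on the hunk above, a minimal sketch of the Django ``settings.py`` fragment the tutorial describes might look like the following; the import path ``celerytest.tasks`` is taken from the tutorial's example project, and actually calling the task requires a running broker and worker:

```python
# settings.py (sketch): per the tutorial, all Celery settings go in the
# Django project's settings module; BROKER_URL selects the message broker.
BROKER_URL = 'amqp://guest:guest@localhost:5672/'

# With a worker running, the tutorial's task would then be called like:
#     from celerytest.tasks import add
#     add.delay(4, 4)
```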

docs/getting-started/first-steps-with-celery.rst

Lines changed: 10 additions & 13 deletions
@@ -127,7 +127,7 @@ application or just app in short. Since this instance is used as
 the entry-point for everything you want to do in Celery, like creating tasks and
 managing workers, it must be possible for other modules to import it.
 
-In this tutorial we will keep everything contained in a single module,
+In this tutorial you will keep everything contained in a single module,
 but for larger projects you want to create
 a :ref:`dedicated module <project-layout>`.
 

@@ -146,22 +146,19 @@ Let's create the file :file:`tasks.py`:
 The first argument to :class:`~celery.app.Celery` is the name of the current module,
 this is needed so that names can be automatically generated, the second
 argument is the broker keyword argument which specifies the URL of the
-message broker we want to use.
-
-The broker argument specifies the URL of the broker we want to use,
-we use RabbitMQ here, which is already the default option,
-but see :ref:`celerytut-broker` above if you want to use something different,
+message broker you want to use, using RabbitMQ here, which is already the
+default option. See :ref:`celerytut-broker` above for more choices,
 e.g. for Redis you can use ``redis://localhost``, or MongoDB:
 ``mongodb://localhost``.
 
-We defined a single task, called ``add``, which returns the sum of two numbers.
+You defined a single task, called ``add``, which returns the sum of two numbers.
 
 .. _celerytut-running-celeryd:
 
 Running the celery worker server
 ================================
 
-We now run the worker by executing our program with the ``worker``
+You now run the worker by executing our program with the ``worker``
 argument:
 
 .. code-block:: bash

@@ -192,7 +189,7 @@ There also several other commands available, and help is also available:
 Calling the task
 ================
 
-To call our task we can use the :meth:`~@Task.delay` method.
+To call our task you can use the :meth:`~@Task.delay` method.
 
 This is a handy shortcut to the :meth:`~@Task.apply_async`
 method which gives greater control of the task execution (see

@@ -225,7 +222,7 @@ built-in result backends to choose from: `SQLAlchemy`_/`Django`_ ORM,
 .. _`SQLAlchemy`: http://www.sqlalchemy.org/
 .. _`Django`: http://djangoproject.com
 
-For this example we will use the `amqp` result backend, which sends states
+For this example you will use the `amqp` result backend, which sends states
 as messages. The backend is specified via the ``backend`` argument to
 :class:`@Celery`, (or via the :setting:`CELERY_RESULT_BACKEND` setting if
 you choose to use a configuration module)::

@@ -240,7 +237,7 @@ the message broker (a popular combination)::
 To read more about result backends please see :ref:`task-result-backends`.
 
 Now with the result backend configured, let's call the task again.
-This time we'll hold on to the :class:`~@AsyncResult` instance returned
+This time you'll hold on to the :class:`~@AsyncResult` instance returned
 when you call a task::
 
     >>> result = add.delay(4, 4)

@@ -251,7 +248,7 @@ has finished processing or not::
     >>> result.ready()
     False
 
-We can wait for the result to complete, but this is rarely used
+You can wait for the result to complete, but this is rarely used
 since it turns the asynchronous call into a synchronous one::
 
    >>> result.get(timeout=1)

@@ -264,7 +261,7 @@ the ``propagate`` argument::
    >>> result.get(propagate=True)
 
 
-If the task raised an exception we can also gain access to the
+If the task raised an exception you can also gain access to the
 original traceback::
 
    >>> result.traceback
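The hunks above walk through ``result.ready()`` and ``result.get()`` on an :class:`~@AsyncResult`. A broker-free sketch of that behavior (an eager stand-in for illustration, not Celery's real ``AsyncResult`` class) is:

```python
# Eager stand-in for the AsyncResult behavior the tutorial shows:
# ready() reports whether processing has finished, get() returns the value.
class EagerResult:
    def __init__(self, value):
        self._value = value
        self._done = True  # eager sketch: "processed" immediately

    def ready(self):
        return self._done

    def get(self, timeout=None, propagate=True):
        # The real get() can block up to `timeout` and re-raise task
        # exceptions when propagate=True; this sketch just returns.
        return self._value

result = EagerResult(4 + 4)   # models result = add.delay(4, 4)
assert result.ready()
print(result.get(timeout=1))  # -> 8
```

The real object differs in that ``ready()`` can be ``False`` while the worker is still busy, which is why the guide calls synchronous ``get()`` rarely used.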

docs/getting-started/next-steps.rst

Lines changed: 14 additions & 14 deletions
@@ -5,7 +5,7 @@
 ============
 
 The :ref:`first-steps` guide is intentionally minimal. In this guide
-we will demonstrate what Celery offers in more detail, including
+I will demonstrate what Celery offers in more detail, including
 how to add Celery support for your application and library.
 
 This document does not document all of Celery's features and

@@ -36,7 +36,7 @@ Project layout::
 .. literalinclude:: ../../examples/next-steps/proj/celery.py
    :language: python
 
-In this module we created our :class:`@Celery` instance (sometimes
+In this module you created our :class:`@Celery` instance (sometimes
 referred to as the *app*). To use Celery within your project
 you simply import this instance.
 

@@ -47,17 +47,17 @@ you simply import this instance.
 - The ``backend`` argument specifies the result backend to use,
 
   It's used to keep track of task state and results.
-  While results are disabled by default we use the amqp backend here
-  to demonstrate how retrieving the results work, you may want to use
-  a different backend for your application, as they all have different
-  strengths and weaknesses. If you don't need results it's best
+  While results are disabled by default I use the amqp backend here
+  because I demonstrate how retrieving results work later, you may want to use
+  a different backend for your application. They all have different
+  strengths and weaknesses. If you don't need results it's better
   to disable them. Results can also be disabled for individual tasks
   by setting the ``@task(ignore_result=True)`` option.
 
   See :ref:`celerytut-keeping-results` for more information.
 
 - The ``include`` argument is a list of modules to import when
-  the worker starts. We need to add our tasks module here so
+  the worker starts. You need to add our tasks module here so
   that the worker is able to find our tasks.
 
 :file:`proj/tasks.py`

@@ -275,9 +275,9 @@ backend that suits every application, so to choose one you need to consider
 the drawbacks of each individual backend. For many tasks
 keeping the return value isn't even very useful, so it's a sensible default to
 have. Also note that result backends are not used for monitoring tasks and workers,
-for that we use dedicated event messages (see :ref:`guide-monitoring`).
+for that Celery uses dedicated event messages (see :ref:`guide-monitoring`).
 
-If you have a result backend configured we can retrieve the return
+If you have a result backend configured you can retrieve the return
 value of a task::
 
    >>> res = add.delay(2, 2)

@@ -289,7 +289,7 @@ You can find the task's id by looking at the :attr:`id` attribute::
    >>> res.id
    d6b3aea2-fb9b-4ebc-8da4-848818db9114
 
-We can also inspect the exception and traceback if the task raised an
+You can also inspect the exception and traceback if the task raised an
 exception, in fact ``result.get()`` will propagate any errors by default::
 
    >>> res = add.delay(2)

@@ -359,7 +359,7 @@ Calling tasks is described in detail in the
 *Canvas*: Designing Workflows
 =============================
 
-We just learned how to call a task using the tasks ``delay`` method,
+You just learned how to call a task using the tasks ``delay`` method,
 and this is often all you need, but sometimes you may want to pass the
 signature of a task invocation to another process or as an argument to another
 function, for this Celery uses something called *subtasks*.

@@ -408,7 +408,7 @@ and this can be resolved when calling the subtask::
    >>> res.get()
    10
 
-Here we added the argument 8, which was prepended to the existing argument 2
+Here you added the argument 8, which was prepended to the existing argument 2
 forming a complete signature of ``add(8, 2)``.
 
 Keyword arguments can also be added later, these are then merged with any

@@ -430,8 +430,8 @@ As stated subtasks supports the calling API, which means that:
   to the arguments in the signature, and keyword arguments is merged with any
   existing keys.
 
-So this all seems very useful, but what can we actually do with these?
-To get to that we must introduce the canvas primitives...
+So this all seems very useful, but what can you actually do with these?
+To get to that I must introduce the canvas primitives...
 
 The Primitives
 --------------
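One hunk above notes that an argument supplied later is *prepended* to the argument already stored in a partial signature, so ``add.s(2)`` called with ``8`` runs ``add(8, 2)``. A broker-free sketch of that rule (a stand-in class for illustration, not Celery's real ``subtask``) is:

```python
def add(x, y):
    return x + y

class Signature:
    """Minimal stand-in for a Celery subtask: a task plus stored arguments."""
    def __init__(self, fn, *args):
        self.fn = fn
        self.args = args

    def delay(self, *extra):
        # Arguments supplied at call time are prepended to the stored ones,
        # matching the guide: add.s(2) then .delay(8) runs add(8, 2).
        return self.fn(*(extra + self.args))

sig = Signature(add, 2)       # models add.s(2)
print(sig.delay(8))           # models sig.delay(8) -> add(8, 2) -> 10
```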

docs/userguide/application.rst

Lines changed: 3 additions & 3 deletions
@@ -58,7 +58,7 @@ Whenever you define a task, that task will also be added to the local registry:
    >>> celery.tasks['__main__.add']
    <@task: __main__.add>
 
-and there we see that ``__main__`` again; whenever Celery is not able
+and there you see that ``__main__`` again; whenever Celery is not able
 to detect what module the function belongs to, it uses the main module
 name to generate the beginning of the task name.
 

@@ -254,7 +254,7 @@ of the task to happen either when the task is used, or after the
 application has been *finalized*,
 
 This example shows how the task is not created until
-we use the task, or access an attribute (in this case :meth:`repr`):
+you use the task, or access an attribute (in this case :meth:`repr`):
 
 .. code-block:: python
 

@@ -329,7 +329,7 @@ While it's possible to depend on the current app
 being set, the best practice is to always pass the app instance
 around to anything that needs it.
 
-We call this the "app chain", since it creates a chain
+I call this the "app chain", since it creates a chain
 of instances depending on the app being passed.
 
 The following example is considered bad practice:
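The middle hunk above describes tasks that are not created until used or until an attribute such as ``repr()`` is accessed. A toy sketch of that lazy-evaluation idea (names and mechanism illustrative, not Celery's real internals) is:

```python
# Sketch of the lazy-task behavior described above: the wrapped function
# is not "evaluated" until the task is used or repr() is accessed.
class LazyTask:
    def __init__(self, fn):
        self.fn = fn
        self.evaluated = False

    def _evaluate(self):
        self.evaluated = True
        return self.fn

    def __repr__(self):
        # Accessing repr forces evaluation, mirroring the example in the doc.
        return '<@task: %s.%s>' % ('__main__', self._evaluate().__name__)

def add(x, y):
    return x + y

task = LazyTask(add)
assert not task.evaluated   # nothing has happened yet
repr(task)                  # attribute access forces evaluation
assert task.evaluated
```

The ``__main__`` prefix mirrors the first hunk: when the defining module cannot be detected, the main module name starts the task name.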

docs/userguide/calling.rst

Lines changed: 2 additions & 2 deletions
@@ -66,7 +66,7 @@ function:
 
    task.delay(arg1, arg2, kwarg1='x', kwarg2='y')
 
-Using :meth:`~@Task.apply_async` instead we have to write:
+Using :meth:`~@Task.apply_async` instead you have to write:
 
 .. code-block:: python
 

@@ -118,7 +118,7 @@ as a partial argument:
 
 .. sidebar:: What is ``s``?
 
-    The ``add.s`` call used here is called a subtask, we talk
+    The ``add.s`` call used here is called a subtask, I talk
     more about subtasks in the :ref:`canvas guide <guide-canvas>`,
     where you can also learn about :class:`~celery.chain`, which
     is a simpler way to chain tasks together.
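The first hunk above contrasts ``task.delay(arg1, arg2, kwarg1='x')`` with the ``apply_async`` spelling. A broker-free sketch of the relationship between the two calls (a stand-in class, not Celery's real ``Task``) is:

```python
# Sketch of the calling API: delay(*args, **kwargs) is shorthand for
# apply_async(args, kwargs), which takes a tuple and a dict explicitly.
class Task:
    def __init__(self, fn):
        self.fn = fn

    def apply_async(self, args=(), kwargs=None):
        # The real apply_async also accepts execution options (countdown,
        # queue, ...); this eager sketch just invokes the function.
        return self.fn(*args, **(kwargs or {}))

    def delay(self, *args, **kwargs):
        return self.apply_async(args, kwargs)

add = Task(lambda x, y: x + y)
assert add.delay(2, 2) == add.apply_async((2, 2)) == 4
```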

docs/userguide/canvas.rst

Lines changed: 14 additions & 13 deletions
@@ -15,10 +15,11 @@ Subtasks
 
 .. versionadded:: 2.0
 
-We just learned how to call a task using the tasks ``delay`` method,
-and this is often all you need, but sometimes you may want to pass the
-signature of a task invocation to another process or as an argument to another
-function, for this Celery uses something called *subtasks*.
+You just learned how to call a task using the tasks ``delay`` method
+in the :ref:`calling <calling>` guide, and this is often all you need,
+but sometimes you may want to pass the signature of a task invocation to
+another process or as an argument to another function, for this Celery uses
+something called *subtasks*.
 
 A :func:`~celery.subtask` wraps the arguments, keyword arguments, and execution options
 of a single task invocation in a way such that it can be passed to functions

@@ -48,7 +49,7 @@ or even serialized and sent across the wire.
    >>> add.s(2, 2, debug=True)
    tasks.add(2, 2, debug=True)
 
-- From any subtask instance we can inspect the different fields::
+- From any subtask instance you can inspect the different fields::
 
    >>> s = add.subtask((2, 2), {'debug': True}, countdown=10)
    >>> s.args

@@ -133,7 +134,7 @@ so it's not possible to call the subtask with partial args/kwargs.
 
 .. note::
 
-    In this tutorial we sometimes use the prefix operator `~` to subtasks.
+    In this tutorial I sometimes use the prefix operator `~` to subtasks.
     You probably shouldn't use it in your production code, but it's a handy shortcut
     when experimenting in the Python shell::
 

@@ -158,7 +159,7 @@ to ``apply_async``::
 The callback will only be applied if the task exited successfully,
 and it will be applied with the return value of the parent task as argument.
 
-As we mentioned earlier, any arguments you add to `subtask`,
+As I mentioned earlier, any arguments you add to `subtask`,
 will be prepended to the arguments specified by the subtask itself!
 
 If you have the subtask::

@@ -255,7 +256,7 @@ Here's some examples:
 
 - Immutable subtasks
 
-    As we have learned signatures can be partial, so that arguments can be
+    Signatures can be partial so arguments can be
     added to the existing arguments, but you may not always want that,
     for example if you don't want the result of the previous task in a chain.
 

@@ -268,7 +269,7 @@ Here's some examples:
 
    >>> add.si(2, 2)
 
-    Now we can create a chain of independent tasks instead::
+    Now you can create a chain of independent tasks instead::
 
    >>> res = (add.si(2, 2), add.si(4, 4), add.s(8, 8))()
    >>> res.get()

@@ -282,7 +283,7 @@ Here's some examples:
 
 - Simple group
 
-    We can easily create a group of tasks to execute in parallel::
+    You can easily create a group of tasks to execute in parallel::
 
    >>> from celery import group
    >>> res = group(add.s(i, i) for i in xrange(10))()

@@ -305,7 +306,7 @@ Here's some examples:
    >>> g = group(add.s(i, i) for i in xrange(10))
    >>> g.apply_async()
 
-    This is useful because we can e.g. specify a time for the
+    This is useful because you can e.g. specify a time for the
     messages in the group to be called::
 
    >>> g.apply_async(countdown=10)

@@ -673,7 +674,7 @@ finished executing.
 Let's calculate the sum of the expression
 :math:`1 + 1 + 2 + 2 + 3 + 3 ... n + n` up to a hundred digits.
 
-First we need two tasks, :func:`add` and :func:`tsum` (:func:`sum` is
+First you need two tasks, :func:`add` and :func:`tsum` (:func:`sum` is
 already a standard function):
 
 .. code-block:: python

@@ -687,7 +688,7 @@ already a standard function):
        return sum(numbers)
 
 
-Now we can use a chord to calculate each addition step in parallel, and then
+Now you can use a chord to calculate each addition step in parallel, and then
 get the sum of the resulting numbers::
 
    >>> from celery import chord
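The chord hunks above run ``add(i, i)`` for a range of numbers in parallel and feed the collected results to ``tsum``. An eager, broker-free sketch of that dataflow (no real parallelism, just the header-then-body shape) is:

```python
# Eager sketch of the chord example: the "header" runs add(i, i) for each
# i, then tsum (the chord body) receives the list of results.
def add(x, y):
    return x + y

def tsum(numbers):
    return sum(numbers)

header = [(i, i) for i in range(100)]
result = tsum([add(x, y) for x, y in header])
print(result)  # sum of i + i for i in 0..99 -> 9900
```

In real Celery the header tasks execute on workers concurrently and the body fires only after all of them finish; the value computed is the same.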

docs/userguide/monitoring.rst

Lines changed: 9 additions & 8 deletions
@@ -316,7 +316,9 @@ By default monitor data for successful tasks will expire in 1 day,
 failed tasks in 3 days and pending tasks in 5 days.
 
 You can change the expiry times for each of these using
-adding the following settings to your :file:`settings.py`::
+adding the following settings to your :file:`settings.py`:
+
+.. code-block:: python
 
    from datetime import timedelta
 

@@ -588,7 +590,7 @@ Even a single worker can produce a huge amount of events, so storing
 the history of all events on disk may be very expensive.
 
 A sequence of events describes the cluster state in that time period,
-by taking periodic snapshots of this state we can keep all history, but
+by taking periodic snapshots of this state you can keep all history, but
 still only periodically write it to disk.
 
 To take snapshots you need a Camera class, with this you can define

@@ -638,7 +640,7 @@ See the API reference for :mod:`celery.events.state` to read more
 about state objects.
 
 Now you can use this cam with :program:`celery events` by specifying
-it with the `-c` option:
+it with the :option:`-c` option:
 
 .. code-block:: bash
 

@@ -667,7 +669,7 @@ Or you can use it programmatically like this:
 Real-time processing
 --------------------
 
-To process events in real-time we need the following
+To process events in real-time you need the following
 
 - An event consumer (this is the ``Receiver``)
 

@@ -686,8 +688,7 @@ To process events in real-time we need the following
 together as events come in, making sure timestamps are in sync, and so on.
 
 
-
-Combining these we can easily process events in real-time:
+Combining these you can easily process events in real-time:
 
 
 .. code-block:: python

@@ -714,11 +715,11 @@ Combining these we can easily process events in real-time:
 .. note::
 
     The wakeup argument to ``capture`` sends a signal to all workers
-    to force them to send a heartbeat. This way we can immediately see
+    to force them to send a heartbeat. This way you can immediately see
     workers when the monitor starts.
 
-We can listen to specific events by specifying the handlers:
+You can listen to specific events by specifying the handlers:
 
 .. code-block:: python
 
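The first monitoring hunk above turns the expiry-settings literal block into an explicit ``.. code-block:: python``. A sketch of what that settings fragment might contain, matching the 1/3/5-day defaults stated in the context lines (the ``CELERYCAM_*`` names are django-celery settings and an assumption beyond what the diff itself shows):

```python
from datetime import timedelta

# Expiry times for monitor data, mirroring the defaults the guide states:
# successful tasks 1 day, failed tasks 3 days, pending tasks 5 days.
CELERYCAM_EXPIRE_SUCCESS = timedelta(days=1)
CELERYCAM_EXPIRE_ERROR = timedelta(days=3)
CELERYCAM_EXPIRE_PENDING = timedelta(days=5)
```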
