RFC: Cooperative concurrency model of choice for MicroPython #242

Closed
pfalcon opened this issue Jan 30, 2014 · 11 comments
Labels
rfc Request for Comment

Comments

@pfalcon
Contributor

pfalcon commented Jan 30, 2014

This question was first raised in comments to the Kickstarter campaign. I'd like to set up this ticket for informal background discussion, to let interested parties discuss and share ideas on the matter, and to serve as a guide for further implementation directions.

Summary of my proposal: choose asynchronous concurrency via generators as the native, Pythonic, high-level way to do concurrency in Python.

I guess almost everyone has heard that Python generators can be used to implement coroutines, but probably few actually use them or are up to date with their status. Neither was I, so I spent the last couple of weeks surveying the state of the art in Python async concurrency in general, and generators in particular. There have been gradual advances since 2.5, and the process continues, with the upcoming 3.4 offering a standardized async coroutine framework.
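As a concrete illustration (my sketch, not taken from the thread), the PEP 342 enhancement turns a generator into a coroutine: the caller can push values into it with `.send()`, and the generator receives them as the result of a `yield` expression:

```python
def averager():
    """A PEP 342 coroutine: receives numbers via .send(), yields the running mean."""
    total = 0.0
    count = 0
    avg = None
    while True:
        value = yield avg   # .send(value) resumes here, handing in the value
        total += value
        count += 1
        avg = total / count

coro = averager()
next(coro)              # prime the coroutine: advance to the first yield
print(coro.send(10))    # 10.0
print(coro.send(20))    # 15.0
```

The priming `next()` call is mandatory: a value can only be sent to a generator that is already suspended at a `yield`.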

Formal references:

  1. http://www.python.org/dev/peps/pep-0342/ "Coroutines via Enhanced Generators"
  2. http://www.python.org/dev/peps/pep-0380/ "Syntax for Delegating to a Subgenerator"
  3. http://www.python.org/dev/peps/pep-3156/ "Asynchronous IO Support Rebooted: the "asyncio" Module"

Tutorials:
4. http://www.dabeaz.com/coroutines/index.html "A Curious Course on Coroutines and Concurrency"
5. http://lgiordani.github.io/blog/2013/03/29/python-generators-from-iterators-to-cooperative-multitasking-3/ "Python Generators - From Iterators to Cooperative Multitasking"

Reviews and comparisons:
6. http://nichol.as/asynchronous-servers-in-python "Asynchronous Servers in Python"

@Neon22
Contributor

Neon22 commented Jan 30, 2014

In my brief evaluation of this substantial info, it looks to me like:
Generators aren't quite ready to be the solution for async concurrency, but they look promising for the future.
Having said that, Diesel is a legitimate solution which exists and works this way now.
The other approach appears to be greenlet-based, and it is also working and performing well.

Of course, most of these discussions center around webservers (as they want concurrent socket-based ops right now) rather than the focus on async processes which we would like in an embedded approach.
I need to read more...

http://diesel.io/
https://pypi.python.org/pypi/greenlet

@dpgeorge
Member

I saw Protothreads and the description sounded good, but it may be too simplistic to get uPy functions running asynchronously.

http://dunkels.com/adam/pt/

@pfalcon
Contributor Author

pfalcon commented Jan 30, 2014

@dpgeorge: Well, protothreads are an inferior analogue of Python generators (both are kinds of coroutines). We don't need them in uPy; they're needed when you have 1K of flash and 128 bytes of RAM ;-). What you've already implemented for generators is more advanced and flexible than protothreads.

Btw, I thought you had generators destined as the concurrency model yourself, because otherwise I'd personally have implemented generator support last ;-).

@KeithJRome

FWIW, async generators have been picking up momentum in recent years in more mature mainstream frameworks; for example, they were relatively recently introduced into the core C# language, where they are now the recommended approach on that platform.

My point being that developers are more accepting of the approach today than they would have been several years ago.

Personally, I like them.

@dpgeorge
Member

Hah, well, I just wanted to get as Python-conformant as I could, and generators didn't need much extra work, so they are there.

I'll have to learn more about how to use them for async concurrency.

Do you think it's enough to have concurrency at the Python high-level, and not at the C level? It's much easier at the Python level (since it's basically done).

To get it working, is there more to implement in the core of uPy, or is it just a pure Python library that needs to be ported?

@pfalcon
Contributor Author

pfalcon commented Jan 31, 2014

It's probably worth getting the terminology straight, because there's a duality between synchronous and asynchronous in a multitasking framework. With (a typical implementation of) cooperative multitasking, we want to handle async events/processes with a sync event loop. And with preemptive multitasking, we want synchronous processes scheduled and running asynchronously.

So, we're talking here about a cooperative multitasking model; I've updated the title accordingly. This can be implemented in pure Python. I don't even raise the question of preemptive multitasking. First of all, I don't think it's useful for MCU usage, and it would surely require an RTOS, enough memory for thread stacks, etc. Thread-based multitasking would be more useful for the unix port, even though best practice would be to use a thread pool to adapt blocking sync services into cooperative async ones. But even that would require either pseudo-threadsafety with locks (oh no, we had enough of the GIL!) or true "lockless" threadsafety by supporting multiple VMs (Squirrel does that: each and every API function takes a VM pointer, how do you like that?).

To get it working, is there more to implement in the core of uPy, or is it just a pure Python library that needs to be ported?

So, I submitted this informal ticket essentially as a prelude to #243. Trivial "embedded" cooperative concurrency is doable with py2.3 generators:

queue = []
def task1(param):
    while 1:
        queue.append(param)
        yield
def task2():
    while 1:
        while queue:
            print(queue.pop(0))
        yield
tasks = [task1(10), task1(20), task2()]
while 1:
    for t in tasks:
        next(t)

To add seamless async I/O (which looks like a "system service", though it's actually implemented by a coroutine similar to any other task), it takes Python 2.5's .send() method, which I implemented recently. But to make it real-world usable, exception handling needs to be implemented, which is the concern of #243.
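To make the .send() point concrete, here is a minimal sketch (my illustration; the task/request names are made up): a task yields a request describing the I/O it wants, and the scheduler resumes it with .send(), handing the "I/O result" back in as the value of the yield expression:

```python
import collections

results = []

def echo_task(name):
    while True:
        data = yield ("read", name)    # pretend async-read request to the scheduler
        results.append((name, data))   # use the "I/O result" handed back by .send()

def scheduler(tasks, inputs):
    ready = collections.deque(tasks)
    for t in ready:
        next(t)                        # prime each task up to its first yield
    while inputs:
        t = ready.popleft()
        t.send(inputs.pop(0))          # resume the task with the "I/O result"
        ready.append(t)                # round-robin: back to the end of the queue

scheduler([echo_task("a"), echo_task("b")], ["x", "y"])
print(results)    # [('a', 'x'), ('b', 'y')]
```

The yielded tuple plays the role of the "system call": the event loop is the only place that touches real I/O, while tasks stay plain generators.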

@pfalcon
Contributor Author

pfalcon commented Jan 31, 2014

I'll have to learn more about how to use them as async concurrency.

I recommend David Beazley's tutorial (#4 above); it in particular clearly addresses @Neon22's concern that "they center around webservers rather than the focus on async processes which we would like in an embedded approach". No, there's no difference: they center around arbitrary async events/processes, and there's an elegant way to plug in anything and everything.

@pfalcon
Contributor Author

pfalcon commented Mar 22, 2014

Ok, Python 3.4 was released, featuring the asyncio module: http://docs.python.org/3.4/library/asyncio.html . Googling around, it's getting critical acclaim. So, it's no longer a question but a fact: the cooperative concurrency model of choice for Python is coroutines via generators.

But asyncio is based on the "yield from" construct, which itself appeared only in py3.3. uPy doesn't implement it so far. So, what is "yield from", and why was it selected as the base for the async implementation? An explanation directly from Guido van Rossum (Python's author): https://groups.google.com/forum/#!msg/python-tulip/bmphRrryuFk/aB45sEJUomYJ
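In short (my sketch, not from Guido's post), PEP 380's "yield from" delegates to a subgenerator: values yielded by the inner generator flow out transparently, values passed in via .send() flow through to it, and the subgenerator's return value becomes the value of the "yield from" expression:

```python
def sub():
    x = yield "need input"   # suspends; resumed by .send() on the outer generator
    return x * 2             # return value is captured by "yield from"

def outer():
    result = yield from sub()   # full delegation: yields, sends, and the return value
    yield result

g = outer()
print(next(g))      # need input  (yielded by sub() through the delegation)
print(g.send(21))   # 42          (sub() returned 42; outer() then yields it)
```

This chaining is exactly what lets a coroutine "call" another coroutine without the scheduler having to know about the nesting, which is why asyncio builds on it.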

@dpgeorge
Member

Nice read from Guido about yield from. Now we just need to implement it... :)

@pfalcon
Contributor Author

pfalcon commented Mar 23, 2014

Yeah, opened #366 for that, because it's not clear at all how it should be implemented ;-).

@pfalcon
Contributor Author

pfalcon commented May 10, 2014

Current state:

  1. There's a branch for asyncio-like API in: https://github.com/micropython/micropython-lib/tree/asyncio
  2. It's "asyncio-like" because I found that asyncio's design decisions don't fit well with MicroPython's: uPy's primary goal is to stay lean and efficient, while asyncio's goal is to provide an interoperability framework for existing async libraries. More info: https://groups.google.com/forum/#!topic/python-tulip/zfMQIUcIR-0
  3. There's now a thread on forum to continue discussion: http://forum.micropython.org/viewtopic.php?f=3&t=85

Closing this ticket thus.

@pfalcon pfalcon closed this as completed May 10, 2014