-
I like this idea. Quoting the Python Zen:
Another complication would be to make this work seamlessly between MicroPython and Pyodide (as both orchestrator and workers).
-
Feels like a mix of SharedWorker and PyScript donkey capabilities ... we investigated the former approach and it's mostly a no-go because it requires that file to be hosted where the user wants these features; we cannot inject a SharedWorker on our end due to Web constraints. The donkey already lets you spawn as many workers as you want, providing a local/global state to each ... but there is no orchestration to synchronize those, although a leading worker could do the trick. That being said, this feels like a lot of work for zero provided use cases, and if WASM doesn't have it, I believe there are tons of valid reasons or limitations on the Web for why that's the case. This is also not currently part of my key goals, so it would need coordination with the rest of the team, as this would be a huge effort, imho, with tons of unknowns in the making. /cc @ntoll
-
This is an interesting case that's also a good example of why PyScript is such a fascinating project. Clearly, multiprocessing is the "blessed" CPython way to achieve a certain end. Sadly, there's no equivalent in MicroPython. Furthermore, the web world also has a similar thing in the form of web workers which, funnily enough, I often describe as isolated sub-processes of your web page to Pythonistas who may not yet grok the browser. All this to say that we already have the capabilities of multiprocessing via web workers. I'm with @WebReflection in that we're balancing something Pythonic (multiprocessing) with the browser (web workers), and it's already a solved problem. Perhaps the answer is really, "in idiomatic PyScript we use the donkey or built-in worker support"..?
-
Honestly, I never bothered to investigate what "donkey" means. I am just using the xworker and sync APIs.
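For context, the pattern I mean looks roughly like this (a minimal sketch assuming PyScript's documented `PyWorker` and worker-side `sync` objects; `worker.py` and `compute` are placeholder names, and the exact readiness/await details should be checked against the current docs):

```python
# worker.py -- runs inside a web worker
from pyscript import sync

def compute(n):
    # CPU-bound work we don't want blocking the main thread
    return sum(i * i for i in range(n))

# expose the function to the main thread
sync.compute = compute
```

```python
# main.py -- runs on the main thread
from pyscript import PyWorker

# spawn a worker; the interpreter is chosen per worker
worker = PyWorker("./worker.py", type="micropython")

# functions the worker exposed via `sync` are reachable through worker.sync;
# the call is awaited because the main thread must not block, and the exact
# readiness semantics depend on the PyScript release (see the docs)
result = await worker.sync.compute(1_000_000)
```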
-
So I think I'm correct if I say: I'm not sure how to best characterise the use cases for the other approaches. (I continue to work on the ltk 05 starter project (workers), which is in a PR to the examples GitHub repo.)
-
So maybe the relevant aspects of this should be moved to the docs, as well as #2315?
-
My 2¢ as a user: I'd never expect `multiprocessing` to work on PyScript; I would never think of using it because of how far removed PyScript is from the whole idea of spawning OS processes!
But, in the same way that `threading` and `multiprocessing` share a core API that is very similar and even interchangeable in simple use cases, I'd like to see a module offering that same minimal API to run tasks on web workers.
Also, it would be cool to have a `WorkerPoolExecutor` class similar to `ThreadPoolExecutor` and `ProcessPoolExecutor`. These (1) are easy to use in many cases and (2) integrate with `async`/`await` when the codebase is mostly async.
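To make that concrete, here is a purely hypothetical usage sketch; nothing below exists in PyScript today, and the module name, `WorkerPoolExecutor`, `submit`, and `map` are all invented to mirror the `concurrent.futures` surface:

```python
# Hypothetical API sketch: mirrors concurrent.futures, but returns awaitables
# because the browser's main thread must never block.
from pyscript_workers import WorkerPoolExecutor  # invented module name

def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

async def main():
    async with WorkerPoolExecutor(max_workers=4) as pool:
        # submit() would return an awaitable instead of a blocking Future
        one = await pool.submit(fib, 30)
        # map() could fan the work out across the pool of workers
        many = [r async for r in pool.map(fib, range(25, 30))]
        print(one, many)

await main()
```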
Cheers,
Luciano
-
As the new lib/multiprocessing will NOT be available for WASM, is it a good idea to implement it in PyScript and have workers do the job?
This would have the advantage of abstracting away all worker-specific setup, comms, sync, etc.
But worker config and async/await issues could be difficult.
As someone else suggested, maybe there's an intern somewhere who wants to contribute... :)
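To illustrate what "abstracting away the worker plumbing" could look like, here is a rough, hypothetical sketch; the `WorkerProcess` class and its behaviour are invented for illustration, only `PyWorker` and the worker-side `sync` object come from PyScript, and the exact call/readiness semantics should be checked against the docs:

```python
# Hypothetical multiprocessing-flavoured facade over a PyScript worker.
# WorkerProcess is invented for illustration; PyWorker and the worker-side
# `sync` object are the real building blocks.
from pyscript import PyWorker

class WorkerProcess:
    """Process-like wrapper: one worker script, one exported entry point."""

    def __init__(self, script, entry="run", type="micropython"):
        # hides worker creation/config behind a constructor
        self._worker = PyWorker(script, type=type)
        self._entry = entry

    async def run(self, *args):
        # the worker script is expected to export `entry` via `sync`,
        # e.g. `sync.run = run` in worker.py
        return await getattr(self._worker.sync, self._entry)(*args)

# usage: worker.py defines a run(n) function and exports it via sync
proc = WorkerProcess("./worker.py")
result = await proc.run(10_000)
```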