StreamedJsonResponse does not stream #60257
Comments
Hey @macropay-solutions, AFAIK, the […]. If you're in that second case, you might want to take a look at the upcoming and experimental JsonStreamer component, which will be released in Symfony 7.3.
@mtarld we tried giving it the iterator (LazyCollection) directly, not an array with an iterator in it. In both cases we received a memory error on 3.7 mil rows, for example.

Update on the description of the behaviour: with the LazyCollection as the first param, the result looks like this:

[
{
"id": 2000,
"parent_id": null,
"client_id": 49507,
"currency": "EUR",
"value": "23.00",
"created_at": "2024-01-17 10:05:07",
"updated_at": null,
"primary_key_identifier": "2000"
},...
]
I think you're right, it might be something here 🙂
@mtarld Nope. We also tried with flush and ob_flush. If you read through the related issue from Laravel, you will see why.
Is JS capable of decoding partial JSON strings if the JSON is sent like:
[…]
then
[…]
then
[…]
?
But in our implementation, which echoes a JSON on a new row and calls ob_flush and flush after each echo, about 300-400 rows out of 20k are split while being sent. So the above question about JS becomes: is JS capable of decoding partial JSON strings if the JSON is sent like:
[…]
then
[…]
then
[…]
?
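For what it's worth, the limitation this question hinges on can be demonstrated in PHP, whose json_decode behaves like JS's JSON.parse here: a partial JSON document cannot be decoded, while one complete JSON document per line can. A minimal sketch with illustrative data:

```php
<?php
// A partial chunk of one big JSON array cannot be decoded...
$partialChunk = '[{"id": 2000, "parent_id"';
assert(json_decode($partialChunk) === null);

// ...but if each row is sent as a complete JSON document on its own line,
// every line can be decoded as soon as it arrives.
$stream = "{\"id\":1}\n{\"id\":2}\n{\"id\":3}\n";
$decoded = [];
foreach (explode("\n", trim($stream)) as $line) {
    $decoded[] = json_decode($line, true);
}
assert($decoded === [['id' => 1], ['id' => 2], ['id' => 3]]);
```

In JS, JSON.parse throws on the partial chunk but succeeds per complete line, so splitting the stream at document boundaries (rather than arbitrary byte boundaries) is what matters.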
I'll try to dig into it soon; in the meantime, can you create and share a reproducer so we are sure we are talking about the same thing?
We have little experience with Symfony; this is a Laravel example:

return new \Symfony\Component\HttpFoundation\StreamedJsonResponse(
    Operation::query()->with('client')->lazyByIdDesc(1000, 'id'),
    200,
    [],
    JSON_UNESCAPED_UNICODE | JSON_UNESCAPED_SLASHES
);

It can be seen in action here: https://laravel-crud-wizard.com/laravel-10/laravel-lumen-crud-wizard#operations. Put […] in the textarea and submit => a prompt to download the JSON file will appear, with a delay (until it builds the response in memory). The default condition is id < 20000, to avoid a memory error. To compare with fewer rows in the response, put […] in the textarea; this will be almost instant, returning [
{
"id": 9,
"parent_id": 3,
"client_id": 177601,
"currency": "EUR",
"value": "75.00",
"created_at": "2024-01-17 10:05:04",
"updated_at": "2025-04-17 09:06:48",
"primary_key_identifier": "9"
},
{
"id": 8,
"parent_id": null,
"client_id": 45015,
"currency": "EUR",
"value": "43.00",
"created_at": "2024-01-17 10:05:04",
"updated_at": "2025-04-17 09:15:20",
"primary_key_identifier": "8"
},
{
"id": 7,
"parent_id": 3,
"client_id": 126362,
"currency": "EUR",
"value": "95.00",
"created_at": "2024-01-17 10:05:04",
"updated_at": "2025-04-16 16:39:25",
"primary_key_identifier": "7"
},
{
"id": 6,
"parent_id": 2,
"client_id": 84224,
"currency": "EUR",
"value": "97.00",
"created_at": "2024-01-17 10:05:04",
"updated_at": "2025-04-16 13:47:33",
"primary_key_identifier": "6"
},
{
"id": 5,
"parent_id": 2,
"client_id": 97396,
"currency": "EUR",
"value": "79.00",
"created_at": "2024-01-17 10:05:04",
"updated_at": "2025-04-16 15:32:53",
"primary_key_identifier": "5"
},
{
"id": 4,
"parent_id": 2,
"client_id": 165915,
"currency": "EUR",
"value": "89.00",
"created_at": "2024-01-17 10:05:04",
"updated_at": "2025-04-16 13:33:50",
"primary_key_identifier": "4"
},
{
"id": 1,
"parent_id": null,
"client_id": 1,
"currency": "EUR",
"value": "10.00",
"created_at": "2024-01-03 19:39:02",
"updated_at": "2025-04-16 13:51:10",
"primary_key_identifier": "1"
}
]

For a normal JSON response (and executed queries) for the last situation above, put […] in the textarea to see only the count, or […] to see all rows in a normal JSON response.
As stated here, flushing is your responsibility. Therefore, have you tried something like the following?

$iterateAndFlush = static function (iterable $iterable): iterable {
    $i = 0;
    foreach ($iterable as $item) {
        yield $item;
        if ($i && $i % 200 === 0) {
            ob_flush();
            flush();
        }
        ++$i;
    }
};

return new \Symfony\Component\HttpFoundation\StreamedJsonResponse(
    $iterateAndFlush(Operation::query()->with('client')->lazyByIdDesc(1000, 'id')),
    200,
    [],
    JSON_UNESCAPED_UNICODE | JSON_UNESCAPED_SLASHES
);
Can you share that piece of code with me? Maybe something is going on there.
We edited the local vendor file at this line, adding the ob_flush and flush.

We have now also added ob_flush(); flush(); at that line in the demo page.
@mtarld the whole idea of streaming a JSON was started wrong. From the BE point of view, you avoid memory issues if it works, and that would be all the advantage to it... Our solution (demo) sends each row from the DB as a separate JSON on a new line, making it possible for JS to decode and display it immediately, without waiting for the whole stream to end. Dumb example: suppose you want to watch a movie. The current StreamedJsonResponse implementation, as a logic, prevents you on the FE (and on the BE, as it is coded now) from viewing it until you have downloaded it all.
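The row-per-line approach described above can be sketched roughly as follows. This is a minimal illustration, not the demo's actual code; the function name and the flush handling are assumptions:

```php
<?php
// Sketch: emit each DB row as its own JSON document on a new line and flush
// immediately, so the client can decode rows while the stream is still open.
// $rows stands in for any lazy iterable, e.g. a Laravel LazyCollection.
function streamRowsAsJsonLines(iterable $rows): void
{
    foreach ($rows as $row) {
        echo json_encode($row, JSON_UNESCAPED_UNICODE | JSON_UNESCAPED_SLASHES), "\n";
        if (ob_get_level() > 0) {
            ob_flush(); // push PHP's output buffer onward, if one is active
        }
        flush();        // push the SAPI / web server buffer to the client
    }
}
```

Because every emitted line is a complete JSON document, a consumer can decode each line the moment it arrives instead of buffering the whole response.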
Not all API consumers are implemented in JavaScript. Some other languages have streaming decoders for JSON (and there might be userland streaming decoders in JS as well, btw).
This is not a JSON response then, but a JSONND response, which is a different content type.
Btw, Symfony has already supported returning JSONND responses in a streaming way for years, by using the StreamedResponse (as you can use a normal […]
@stof Our solution extends StreamedResponse, but we don't put types on keys like […], so it is not JSONND.

UPDATE: According to https://en.m.wikipedia.org/wiki/JSON_streaming […]
According to https://github.com/glenkleidon/JSON-ND […]
Based on this, we could use these MIME types: […]
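Whatever MIME type the elided list above settled on, line-delimited JSON is commonly labeled application/x-ndjson. A hedged sketch of the wire format (the helper name is illustrative; with Symfony, the same header can be passed to StreamedResponse):

```php
<?php
// Sketch: build an NDJSON body (one complete JSON document per line). In a
// real response each line would be echoed and flushed row by row; a string is
// returned here only to show the wire format. The response would declare
// 'Content-Type: application/x-ndjson' (a common convention, an assumption
// here, not something this thread fixed).
function ndjsonBody(iterable $rows): string
{
    $body = '';
    foreach ($rows as $row) {
        $body .= json_encode($row, JSON_UNESCAPED_UNICODE | JSON_UNESCAPED_SLASHES) . "\n";
    }
    return $body;
}
```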
@mtarld we deployed your changes from here in our demo page. There is still a memory error for 1 mil rows.

We rolled it back from 1 mil to 20000 rows and to the original StreamedJsonResponse. This is how we call it: […]
I am afraid that we will not be able to help you without an application that easily allows reproducing your issue. Otherwise, you will probably have to debug a possible solution yourself.
@xabbuh Thank you. The situation is the other way around: we offered Symfony our help by raising this issue and providing an alternative. The docs could be changed to point this out.
Reporting a bug is helpful for the project, for sure, yet the bug reported here depends on multiple factors, including ones that are out of Symfony's control (LazyCollection from Laravel and your own code). Hence we ask you to provide code one can run to reproduce the bug, to make your report even more helpful, as it would enable us to confirm the bug and eventually fix it.
@chalasr we don't have a Symfony dev to create a Symfony demo project that reproduces this atm.
Note that any PHP project would work, even a Laravel one or a vanilla script.
@chalasr you can use this then: https://github.com/macropay-solutions/laravel-crud-wizard-decorator-free-demo/tree/StreamedJsonResponse_not_streaming
Create .env from the .env example, run composer install, then call GET /api/operations?limit=-1
Hi,
We implemented a StreamedJsonResponse for large sets of data and noticed that it crashes with memory issues, because json_encode can't be used for streaming data.
What alternatives exist for this issue?
If it can't stream data, why is it called Streamed?
Thank you.
Originally posted by @macropay-solutions in #60252
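The memory issue the report describes can be boiled down to a small sketch (illustrative data; the generator stands in for a lazy DB cursor): encoding the whole dataset at once materializes every row in memory, while encoding one item at a time produces the same JSON text without ever holding all rows at once.

```php
<?php
// A generator standing in for a lazy DB cursor (e.g. a LazyCollection).
function rows(int $n): \Generator
{
    for ($i = 1; $i <= $n; $i++) {
        yield ['id' => $i];
    }
}

// Memory-hungry: materialize every row, then encode the whole array at once.
$allAtOnce = json_encode(iterator_to_array(rows(3)));

// Streaming-friendly: encode one row at a time. In a real response each chunk
// would be echoed and flushed immediately; chunks are collected here only to
// compare the final output.
$chunks = [];
foreach (rows(3) as $row) {
    $chunks[] = json_encode($row);
}
$streamed = '[' . implode(',', $chunks) . ']';

assert($allAtOnce === $streamed); // identical JSON, very different peak memory
```

The per-item strategy is what a streaming JSON response needs; whether a given implementation actually flushes those chunks to the client as they are produced is the separate question this thread is about.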