
Request guidance/feature to stream streaming tool responses back #886

@weihaoxia-01

Description


Is your feature request related to a problem? Please describe.
Is it possible to stream a tool's streaming response back to the client?
If the tool's upstream call returns streaming chunks, for example

const res = await client.fetch(...)
for await (const chunk of res.body) {...}

is it possible to forward each chunk back to the client immediately, rather than buffering the whole response into a single tool result?
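
For concreteness, here is a sketch of the behaviour being asked for. The onChunk callback is hypothetical and does not exist in the SDK; it only marks where each chunk would need to be handed back to the MCP client.

// Hypothetical sketch: onChunk is NOT a real SDK API, it marks the point
// where each chunk would be forwarded to the client as it arrives.
async function streamThrough(url: string, onChunk: (text: string) => Promise<void>) {
  const res = await fetch(url);
  const decoder = new TextDecoder();
  for await (const chunk of res.body!) {
    // Desired: deliver the chunk right away instead of accumulating
    // everything into one final tool result.
    await onChunk(decoder.decode(chunk, { stream: true }));
  }
}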

Describe the solution you'd like
Please provide guidance on how to do this kind of streaming, or implement a new feature to support it.

Describe alternatives you've considered
https://modelcontextprotocol.io/specification/2025-03-26/basic/utilities/progress provides a progress notification mechanism, but not all clients support it.
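
As a partial workaround along those lines, here is a minimal sketch using the TypeScript SDK. It assumes the tool callback's second argument (extra) exposes sendNotification and the request's _meta.progressToken, as in recent SDK versions; exact property names may differ.

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "streaming-demo", version: "1.0.0" });

server.tool("fetch-stream", { url: z.string().url() }, async ({ url }, extra) => {
  // Assumption: the client opted in to progress by sending _meta.progressToken.
  const progressToken = extra._meta?.progressToken;
  const decoder = new TextDecoder();
  const res = await fetch(url);
  let received = 0;
  let text = "";
  for await (const chunk of res.body!) {
    received += chunk.length;
    text += decoder.decode(chunk, { stream: true });
    if (progressToken !== undefined) {
      // notifications/progress (see the spec link above) carries a counter,
      // not the chunk itself, so this only signals that data is arriving.
      await extra.sendNotification({
        method: "notifications/progress",
        params: { progressToken, progress: received },
      });
    }
  }
  // The actual body is still returned only once, when the call completes.
  return { content: [{ type: "text", text }] };
});

The notification only tells the client that data is flowing; the chunk contents themselves still arrive in the final result, so this is not a full substitute for streamed tool output.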

Additional context


Labels: enhancement (New feature or request)
