feat: use proto streams to increase maximum module files payload #18268
Conversation
```go
if protobuf.Size(resp) > drpcsdk.MaxMessageSize {
	// It is likely the modules that is pushing the message size over the limit.
	// Send the modules over a stream of messages instead.
	s.Logger.Info(s.Context(), "plan response too large, sending modules as stream",
		slog.F("size_bytes", len(complete.ModuleFiles)),
	)
	dataUp, chunks := proto.BytesToDataUpload(proto.DataUploadType_UPLOAD_TYPE_MODULE_FILES, complete.ModuleFiles)

	complete.ModuleFiles = nil // sent over the stream
	complete.ModuleFilesHash = dataUp.DataHash
	resp.Type = &proto.Response_Plan{Plan: complete}

	err := s.stream.Send(&proto.Response{Type: &proto.Response_DataUpload{DataUpload: dataUp}})
	if err != nil {
		complete.Error = fmt.Sprintf("send data upload: %s", err.Error())
	} else {
		for i, chunk := range chunks {
			err := s.stream.Send(&proto.Response{Type: &proto.Response_ChunkPiece{ChunkPiece: chunk}})
			if err != nil {
				complete.Error = fmt.Sprintf("send data piece upload %d/%d: %s", i, dataUp.Chunks, err.Error())
				break
			}
		}
	}
}
```
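The chunking helper `proto.BytesToDataUpload` is not shown in this diff. A minimal sketch of what such a helper plausibly does, under stated assumptions (the type names, field names, and chunk size here are illustrative, not the actual implementation):

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

// maxChunkSize is an assumed bound that keeps each message safely
// under the 4MB protobuf limit discussed in this PR.
const maxChunkSize = 3 << 20

// dataUpload and chunkPiece are simplified stand-ins for the proto types.
type dataUpload struct {
	DataHash []byte
	FileSize int64
	Chunks   int32
}

type chunkPiece struct {
	PieceIndex   int32
	Data         []byte
	FullDataHash []byte
}

// bytesToDataUpload hashes the full payload and slices it into
// bounded chunks, returning a header plus the ordered pieces.
func bytesToDataUpload(data []byte) (dataUpload, []chunkPiece) {
	hash := sha256.Sum256(data)
	var pieces []chunkPiece
	for i := 0; i < len(data); i += maxChunkSize {
		end := i + maxChunkSize
		if end > len(data) {
			end = len(data)
		}
		pieces = append(pieces, chunkPiece{
			PieceIndex:   int32(len(pieces)),
			Data:         data[i:end],
			FullDataHash: hash[:],
		})
	}
	return dataUpload{
		DataHash: hash[:],
		FileSize: int64(len(data)),
		Chunks:   int32(len(pieces)),
	}, pieces
}

func main() {
	payload := make([]byte, 7<<20) // a 7MB payload splits into 3 chunks
	up, pieces := bytesToDataUpload(payload)
	fmt.Println(up.Chunks, len(pieces))
}
```

The hash travels with every piece so the receiver can associate chunks with the right upload and verify integrity after reassembly.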
This only triggers if the message size is too large. In my testing, most modules are quite small, so this is actually the edge case.
On dev.coder.com we did hit a module >4MB. We made some changes to get it under the message size limit; however, in the wild it's quite easy to make a module too large. You could, for example, have a README.md with a large uncompressed image.
Out of scope of this PR, but I think this is something we need to remain careful of on the dev registry. Maybe we need some kind of linter that complains if a built module is over the 4MB limit, with some allowlisting for known chonky bois.
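A registry lint like the one suggested could be sketched as follows. Everything here is hypothetical (the function name, the limit constant, and the allowlist shape are not part of this PR):

```go
package main

import "fmt"

// moduleSizeLimit mirrors the 4MB protobuf message cap discussed above.
const moduleSizeLimit = 4 << 20

// allowlist names modules permitted to exceed the limit
// (the "known chonky bois" from the comment above).
var allowlist = map[string]bool{
	// "registry/known-large-module": true,
}

// lintModuleSize returns an error for oversized, non-allowlisted modules.
func lintModuleSize(name string, sizeBytes int64) error {
	if sizeBytes <= moduleSizeLimit || allowlist[name] {
		return nil
	}
	return fmt.Errorf("module %q is %d bytes, over the %d-byte limit",
		name, sizeBytes, moduleSizeLimit)
}

func main() {
	fmt.Println(lintModuleSize("small", 1<<20))        // passes
	fmt.Println(lintModuleSize("huge", 5<<20) != nil)  // flagged
}
```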
I agree. I would like to build out more infra down the road to keep track of the total payload size of the template, even beyond just the modules.
Increase maximum module files payload using protobuf streams
Problem
Coder has a 4MB protobuf message size limit that prevents templates with large Terraform modules from being processed. When dynamic parameters require module files from ~/.terraform/modules that exceed this limit, template creation fails.
Solution
This PR implements protobuf streaming to handle large module files.
Architecture
Coderd ↔ Provisionerd: uses a new streaming RPC when the payload exceeds the limit.
coder/provisionerd/proto/provisionerd.proto
Lines 220 to 222 in e880894
Provisionerd ↔ Provisioner: extends the existing stream with new message types for in-memory communication.
This change enables templates with arbitrarily large Terraform modules while maintaining performance through deduplication and preserving the existing fast path for smaller payloads.
Version bump to 1.7
Old versions (1.6) are compatible with new Coderd, as the previous upload path still works when the module files are <4MB.
Large modules are omitted in v1.6, and templates show a warning about missing modules on the Create Workspace page.
Version 1.7 is not backwards compatible: a 1.7 provisioner will send messages that a previous Coderd does not support.
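A compatibility gate of the kind implied by this versioning scheme could be sketched as below. This is a minimal sketch under assumptions: the constants and function name are invented, and the actual negotiation lives in the provisioner API:

```go
package main

import "fmt"

// Streaming of module files is assumed to require provisioner API 1.7+.
const (
	streamingMajor = 1
	streamingMinor = 7
)

// supportsModuleFileStreaming reports whether a peer at the given
// negotiated API version can receive DataUpload/ChunkPiece messages;
// older peers must get the inline (<4MB) payload or a warning instead.
func supportsModuleFileStreaming(major, minor int) bool {
	return major > streamingMajor ||
		(major == streamingMajor && minor >= streamingMinor)
}

func main() {
	fmt.Println(supportsModuleFileStreaming(1, 6)) // fall back to inline payload
	fmt.Println(supportsModuleFileStreaming(1, 7)) // streaming allowed
}
```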
Manual QA
Old provisioner with a large module
The modules are simply not uploaded, which is the expected behavior.