Large Memory consumption with attachment #82
Comments
Any follow-up on this? 12 MB attachments are not uncommon.
I need to set the Lambda memory limit to 1 GB in order to process 24 MB attachments. Any update?
@drabaioli No, this project has been pretty silent since its release. If you really want to use this, I'd brush up on the underlying technologies and make the improvements you need yourself.
I found a possible alternative (I haven't tried it yet, though): https://github.com/tedder/aws_lambda_ses_forwarder_python3/blob/master/lambda-ses-forwarder-py3.py
I ended up paying for email hosting (Zoho) because I don't want to worry about missing emails and don't have time to learn SES and serverless.
AWS SES has a limit of 10 MB on the size of a message that can be sent. This includes all attachments after encoding, so it effectively limits binary files to around 7.5 MB. See: https://docs.aws.amazon.com/ses/latest/DeveloperGuide/limits.html When the message size is too large, I see an error message in the CloudWatch Logs.
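For context on the ~7.5 MB figure, here is a rough back-of-the-envelope calculation (my own, not from the project): MIME attachments are Base64-encoded, which turns every 3 raw bytes into 4 ASCII characters, so the encoded message is about 4/3 the size of the binary data.

```js
// Rough estimate only: Base64 encodes 3 raw bytes as 4 characters,
// so the encoded size is about 4/3 of the binary size (ignoring headers
// and line-break overhead).
const sesEncodedLimitMB = 10;                  // SES max message size after encoding
const base64Overhead = 4 / 3;                  // 3 bytes -> 4 characters
const maxBinaryAttachmentMB = sesEncodedLimitMB / base64Overhead;
console.log(maxBinaryAttachmentMB.toFixed(1)); // ~7.5 MB, matching the figure above
```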
In my testing, this module handled a message with a 7.4 MB binary attachment (before encoding) with a 768 MB memory configuration in Lambda, and a 6 MB binary attachment (before encoding) with a 512 MB memory configuration. I used the Node.js 8.10 runtime with a 30 second timeout.
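If it helps anyone reproduce this, the memory and timeout settings above can be changed without redeploying the code. A minimal sketch using the AWS SDK for JavaScript (v2); the function name here is a placeholder, not the project's actual deployment name:

```js
// Hypothetical example: raise an existing function's memory/timeout to the
// values used in the testing above. Replace the function name with your own.
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

lambda.updateFunctionConfiguration({
  FunctionName: 'ses-forwarder', // placeholder name
  MemorySize: 768,               // MB
  Timeout: 30                    // seconds
}, (err, data) => {
  if (err) console.error(err);
  else console.log('Updated:', data.MemorySize, 'MB /', data.Timeout, 's');
});
```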
Thanks @arithmetric, I clearly see the limits set by AWS SES. Nonetheless, I'm able to forward emails of 24 MB. The only problem is that the memory required by the Lambda function is over 1 GB.
I'm getting JavaScript heap out-of-memory errors on a 256 MB function attempting to process 12 MB of attachments. In fact, increasing the size showed that it actually used 576 MB.
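One generic way to see where that 576 MB goes is to log process.memoryUsage() around the expensive steps of the handler. This is a diagnostic sketch of my own, not code from this project, and the commented steps are hypothetical stand-ins for whatever the handler actually does:

```js
// Generic diagnostic sketch: log heap usage at each stage so you can see
// which step of processing accounts for the memory spike.
function logMem(label) {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(`${label}: rss=${(rss / 1048576).toFixed(1)} MB, ` +
              `heapUsed=${(heapUsed / 1048576).toFixed(1)} MB`);
}

exports.handler = async (event) => {
  logMem('start');
  // ... fetch the raw message from S3 (hypothetical step) ...
  logMem('after fetch');
  // ... rewrite headers / build the outgoing message (hypothetical step) ...
  logMem('after processing');
  // ... send via SES (hypothetical step) ...
  logMem('before return');
};
```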