Reduce provider binary on-disk size #4383
Comments
Some information: most of the size is already present in the upstream provider build.
From a https://github.com/t0yv0/gobuildsize report on the upstream provider, the major contributing packages are:
@t0yv0 additional feedback from the customer about the impact of the plugin binary size growth:
This makes sense, Ringo; thanks for that detail. With 50GB of plugins, I wonder whether something can be done at the plugin cache level, such as some form of scheduled eviction, since it appears that multiple copies of the provider(s) are involved there.
A bit of context there: this grew over the course of roughly half a year while maintaining 50+ Pulumi stacks, each with its own repo and its own dependencies, which are regularly updated in a semi-automated way.
I think there's a feature request in pulumi/pulumi that could be helpful in a situation like this: pulumi/pulumi#7505. I will cross-link it and add some ideas there. We would love to reduce the binary size of pulumi-aws, but it appears to be dominated by
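The scheduled-eviction idea from the comments above can be sketched outside of Pulumi itself. A minimal, hypothetical Python sketch, assuming a plugin cache laid out as one directory per plugin version with `<name>-v<semver>` naming (an assumption about the on-disk layout, not a documented API), that keeps only the newest cached version of each plugin:

```python
import re
import shutil
from pathlib import Path

# Assumed cache layout: <plugin_root>/<kind>-<name>-v<major>.<minor>.<patch>/
PLUGIN_RE = re.compile(r"^(?P<base>.+)-v(?P<version>\d+\.\d+\.\d+)$")

def evict_old_plugins(plugin_root: Path) -> list[str]:
    """Delete every cached plugin directory except the newest version of each
    plugin; return the names of the directories that were removed."""
    by_plugin: dict[str, list[tuple[tuple[int, ...], Path]]] = {}
    for entry in plugin_root.iterdir():
        m = PLUGIN_RE.match(entry.name)
        if not entry.is_dir() or not m:
            continue  # ignore lock files and unrecognized names
        version = tuple(int(p) for p in m.group("version").split("."))
        by_plugin.setdefault(m.group("base"), []).append((version, entry))
    removed = []
    for versions in by_plugin.values():
        versions.sort()  # ascending, so the newest version is last
        for _, path in versions[:-1]:
            shutil.rmtree(path)
            removed.append(path.name)
    return removed
```

In practice the Pulumi CLI's own plugin management commands would be the supported path for this; the sketch only illustrates how cheap a version-aware eviction pass over a 50GB cache could be.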
Consider looking at options to reduce provider on-disk size.
Per a customer comment, the unpacked on-disk size grew from v5.16 (~400MB) to v6.49 (~800MB).
Benefits of a leaner on-disk provider:
Possible culprits here:
- the embedded schema.json now includes more resources, and more examples for those resources, in more languages (such as Java); could descriptions, or at least examples, be stripped from the schema distributed in the binary?
- more embedded provider metadata; is there any compression that can be applied?
- more Go dependencies statically linked in; is anything prunable?
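The first two culprits can be sized up empirically: recursively drop documentation keys from a schema-shaped JSON tree, then compare raw and gzip-compressed sizes. A rough sketch; the toy schema below is a stand-in for illustration, not the real pulumi-aws schema:

```python
import gzip
import json

def strip_docs(node):
    """Recursively remove 'description' and 'examples' keys from a JSON tree."""
    if isinstance(node, dict):
        return {k: strip_docs(v) for k, v in node.items()
                if k not in ("description", "examples")}
    if isinstance(node, list):
        return [strip_docs(v) for v in node]
    return node

def size_report(schema: dict) -> dict:
    raw = json.dumps(schema).encode()
    stripped = json.dumps(strip_docs(schema)).encode()
    return {
        "raw": len(raw),
        "stripped": len(stripped),
        "raw_gzip": len(gzip.compress(raw)),
        "stripped_gzip": len(gzip.compress(stripped)),
    }

# Toy stand-in for an embedded provider schema.
schema = {
    "resources": {
        "aws:s3/bucket:Bucket": {
            "description": "A long doc string with examples in several languages...",
            "examples": ["example one", "example two"],
            "properties": {"bucket": {"type": "string",
                                      "description": "The bucket name."}},
        }
    }
}
report = size_report(schema)
```

Running this over the real embedded schema would show how much of the binary is documentation versus structure, and whether compressing the metadata pays off on top of stripping it.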