Had a few folks balk at our limiting Lambda function payloads to 5MB. http://arc.codes encourages per-route isolation, and that limit means we get sub-second cold starts (usually ~200ms).
Replying to @brianleroux
I'm considering using bundling to ensure cold start is fast. Only thing is not knowing whether one of your deps uses some C++ or something that can't be bundled.
Replying to @matthewcp
Yes, we did that a bit in the beginning; it made debugging a super PITA tho, so we ended up adding the 5MB limit and things have been smooth since.
Replying to @brianleroux
So how do you share code between functions, then? I put common stuff in a folder and symlink to it. But this means each function's node_modules size is inflated beyond what it actually uses. Hard problem to solve, I think.
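The symlink approach above can be sketched as follows (the folder and file names here are hypothetical, just to illustrate the layout and its downside):

```shell
# Hypothetical layout: each function directory symlinks to one shared folder.
mkdir -p shared src/get-index src/post-login
echo "module.exports = () => 'hi'" > shared/util.js

# From inside src/<function>/, ../../shared points back at the shared folder.
ln -s ../../shared src/get-index/shared
ln -s ../../shared src/post-login/shared

# Each function can now require('./shared/util'), but a naive zip of the
# function directory pulls in the ENTIRE shared folder, used or not --
# which is the size inflation described above.
ls -l src/get-index/shared
```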
Replying to @matthewcp @brianleroux
We bundle using webpack at Bustle and it works very well. Our home-grown framework uses it out of the box. If I were starting today, I'd use Rollup.
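A minimal sketch of what per-function webpack bundling for Lambda can look like (this is not Bustle's actual config; the entry paths are hypothetical):

```javascript
// webpack.config.js -- one entry per Lambda handler, each bundled to a
// single file so the deployed package stays small and cold starts fast.
const path = require('path');

module.exports = {
  target: 'node',              // don't bundle Node built-ins like fs/path
  mode: 'production',
  entry: {
    'get-index': './src/get-index/index.js',   // hypothetical handlers
    'post-login': './src/post-login/index.js',
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name]/index.js',
    libraryTarget: 'commonjs2',  // Lambda loads the handler via module.exports
  },
};
```

Because webpack tree-shakes per entry, each function only ships the code it actually imports, which sidesteps the shared-folder inflation problem.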
Replying to @southpolesteve @brianleroux
That's good to hear. Do you run into issues with some packages not bundling well (e.g., they call fs.readFile on some file in the project folder at runtime)?
Yes. We've made upstream PRs to fix them, marked packages as external in webpack (and then added them back with npm), or just found alternatives.
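The "mark as external, add back with npm" workaround mentioned above might look like this (the `sharp` package here is a hypothetical example of a native dep that won't bundle cleanly):

```javascript
// webpack.config.js sketch: webpack leaves the external package out of
// the bundle and emits a plain require() for it, so it must be installed
// into node_modules in the deployment package (e.g. via npm install).
module.exports = {
  target: 'node',
  mode: 'production',
  entry: './src/handler.js',
  externals: {
    // hypothetical native module: resolved at runtime, not bundle time
    sharp: 'commonjs2 sharp',
  },
  output: {
    filename: 'index.js',
    libraryTarget: 'commonjs2',
  },
};
```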