Don’t focus too much on the packaging format. Container images are an industry standard for packing bits in a tarball, vs Lambda’s use of zip files and layers. https://twitter.com/marknca/status/1116010610687344641
Replying to @kelseyhightower
One big difference to keep in mind is that if you bring your own container you are responsible for maintaining and packaging the runtime layer of the image. One thing I like about Lambda is that the runtime is patched and maintained by AWS. I just provide the code layer
Replying to @nathankpeck
If you take Lambda’s Go implementation you are in the same boat as my “FROM scratch” Docker image holding a single binary. In both cases the runtime is compiled in.
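The “FROM scratch” pattern mentioned above can be sketched roughly like this (a minimal sketch, not from the thread; the file names and Go version are placeholder assumptions):

```dockerfile
# Build a statically linked Go binary (CGO disabled so no libc is needed),
# then copy it into an empty base image: the runtime is compiled into the
# binary itself, so nothing else needs to be maintained or patched in the image.
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /app .

FROM scratch
COPY --from=build /app /app
ENTRYPOINT ["/app"]
```

The resulting image contains only the single binary, which is why the “who patches the runtime layer” question largely disappears for statically linked languages.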
Replying to @kelseyhightower @nathankpeck
Go is unusual in being both compiled and statically linked; I think it’s a red herring in this discussion.
I’m not sure: compiled + static linking seems increasingly to be the norm, even in Python and TypeScript (and in my opinion, _should_ be the norm).
Replying to @endsofthreads @thramp
I think the future of serverless is the exact opposite direction. Right now the most interesting platform I’ve seen is Fastly Lucet: https://www.fastly.com/blog/announcing-lucet-fastly-native-webassembly-compiler-runtime Long story short, it compiles Rust, TypeScript, C, and C++ to WASM, which executes in a V8 sandbox with only 5ms of overhead
Not a V8 sandbox. This is a new runtime, written from scratch in Rust, specifically designed for concurrency and low latency. It’s open source, and included in Lucet.
Yep, sorry I got confused in the tail end of that tweet between Lucet and Cloudflare Workers! Lucet has much lower overhead and the concept of compiling from other languages to WASM is very cool!