Other large software projects like LLVM and Chromium don't have that problem on my system, so it's definitely possible to have a performant build system without requiring an overpowered machine.
Build a production release of Chromium with LTO enabled for CFI and the linking step uses far more memory than anything in AOSP. AOSP has to link far more than a single binary though, so if you choose to use 8 jobs, you'll sometimes end up linking 8 binaries at once.
Which build system would go out of its way to serialize the linking phase? I'm not sure what that has to do with the chosen build system for Android. If you only have 8GB of memory, you're just not going to be able to run 8 parallel jobs for builds using LTO. That's not AOSP-specific.
ninja supports limiting the number of concurrent tasks of a given kind (e.g. linking) via its pool feature, and LLVM's build can already use it. And yes, I know how much of a pain building Chromium with LTO is, but Android is somehow *worse*.
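For context, a pool in a ninja file looks roughly like this (a minimal sketch, the pool and rule names are made up; LLVM exposes the same mechanism through its LLVM_PARALLEL_LINK_JOBS CMake option):

```ninja
# Cap concurrent link jobs at 2 while compilation still uses the full -j level.
pool link_pool
  depth = 2

rule link
  command = clang++ -flto -o $out $in
  pool = link_pool
```

Any build edge assigned to the pool then counts against the pool's depth rather than only the global job count.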
Also, I was initially talking about build system performance, not how demanding the build itself is on my PC. If I run the build command and it takes 5 minutes (with build files already generated) before ninja starts the first command, that sucks.
It takes me 40 seconds to do an incremental build of AOSP with no changes, which is still terrible, but at least 10x faster than it used to be before ninja. I save a lot of time by having separate devices for testing signed production builds and test-key-signed development builds.
The current bottleneck is largely the incomplete transition away from GNU make. There's still a huge amount of build logic in makefiles which has to be converted by kati to ninja files, which takes a long time and generates very sub-par ninja files compared to blueprint / soong.
I can't imagine, as a maintainer, accepting this kind of change without insisting that it be done right before it goes upstream at all (full conversion of logic, no hybrid mess with dynamic conversion).
It's being done project-by-project. It's very unrealistic to do the transition for the entire OS in one go, especially since blueprint/soong need to be enhanced to properly cover all the special cases. For example, soong has a bunch of built-in sanitizer support/logic.
I'm frustrated by the remaining Android.mk files but they're steadily disappearing and it's making a fair bit of what I do much easier. I can reliably see exactly where compiler flags / sanitizers are used, which sanitizer blacklists are used, etc. It helps a lot.
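For reference, a Soong module in an Android.bp file looks roughly like this (an illustrative sketch, the module name and blacklist file are made up, but a per-module sanitize block is how this gets declared):

```
// Hypothetical Android.bp module showing declarative flags and sanitizers.
cc_binary {
    name: "example_daemon",
    srcs: ["main.cpp"],
    cflags: ["-Wall", "-Werror"],
    sanitize: {
        integer_overflow: true,
        blacklist: "sanitizer_blacklist.txt",
    },
}
```

Since everything is declared in this one format, grepping the tree for sanitize blocks really does show every place they're applied.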
The way the old system worked was still much more usable for the hardening work I do than typical build systems. It's a unified build system for the whole OS and used a declarative code style, with centralized logic I could edit and reliably make changes I couldn't elsewhere.
I <3 declarative, but not declarative without compartmentalization/encapsulation. A global build system has no reason to know about dependency relationships or generation rules within a single package, only about such relationships between packages.
The old build system was a way of emulating what they actually wanted, which is implemented by Blueprint. The way it was hacked together with GNU make was very flawed, but I'd take it over most alternatives. Biggest positive is the 1:1 conversion to an actual declarative system.
It's painful dealing with external projects like apps using gradle, make, cmake, etc. as their build system instead, since I have to apply the changes to each of them in an ad-hoc way. I sometimes have trouble making those builds reproducible and it's fragile / high maintenance.


