It doesn't regenerate the ninja files for most builds, because it's very rare that you need to do a clean build. That's something done for production releases, but not so much by developers on their workstations, especially now that incremental builds work so much better.
It regenerates the ninja build files whenever you change a makefile or add code files (because that requires changing a makefile), or (on Android 8, and possibly later releases too) once per day after midnight.
Other large software projects like LLVM and Chromium don't have that problem on my system, so it's definitely possible to have a performant build system without requiring overpowered hardware.
Build a production release of Chromium with LTO enabled for CFI and the linking step uses far more memory than anything in AOSP. AOSP has far more binaries to link than a single browser, though, so if you choose to use 8 jobs, you'll sometimes end up linking 8 binaries at once.
Which build system would go out of its way to serialize the linking phase? I'm not sure what that has to do with the build system chosen for Android. If you only have 8GB of memory, you're just not going to be able to run 8 parallel link jobs for builds using LTO. That's not AOSP-specific.
Ninja supports limiting the number of concurrent tasks of a given kind (e.g. linking) via its pool feature, and LLVM's build system already makes use of it. And yes, I know how much of a pain building Chromium with LTO is, but Android is somehow *worse*.
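For reference, a minimal sketch of what that looks like in a hand-written ninja file (the rule names, file names, and depth here are made up for illustration; only the `pool` syntax itself is ninja's):

```ninja
# Allow at most 2 link jobs at a time, even when
# ninja itself runs with a higher -j parallelism.
pool link_pool
  depth = 2

rule cc
  command = clang++ -flto -c $in -o $out

rule link
  command = clang++ -flto -o $out $in
  pool = link_pool

build foo.o: cc foo.cpp
build bar.o: cc bar.cpp
build foo: link foo.o
build bar: link bar.o
```

LLVM's CMake build exposes this mechanism through the `LLVM_PARALLEL_LINK_JOBS` option when generating for ninja, which is how memory-hungry LTO links get throttled independently of compile parallelism.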
Also, I was initially talking about build system performance, not how demanding the build itself is on my PC. If I run the build command and it takes 5 minutes (with build files already generated) before ninja starts the first command, that sucks.
It takes me 40 seconds to do an incremental build of AOSP with no changes, which is still terrible, but at least 10x faster than it was before ninja. I save a lot of time by having separate devices for testing signed production builds and test-key-signed development builds.
The current bottleneck is largely the incomplete transition away from GNU make. There's still a huge amount of build logic in makefiles which has to be converted by kati into ninja files, which takes a long time and produces far poorer ninja files than Blueprint/Soong does.
I can't imagine, as a maintainer, accepting this kind of change without insisting that it be done right before it goes upstream at all (a full conversion of the logic, not a hybrid mess with dynamic conversion).
It's being done project-by-project. It's very unrealistic to do the transition for the entire OS in one go, especially since Blueprint/Soong still needs to be extended to cover all the special cases properly. For example, it has a bunch of built-in sanitizer support/logic.
I'm frustrated by the remaining Android.mk files but they're steadily disappearing and it's making a fair bit of what I do much easier. I can reliably see exactly where compiler flags / sanitizers are used, which sanitizer blacklists are used, etc. It helps a lot.
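As a rough illustration of why that's easier to audit, here's what a Soong module looks like in an Android.bp file; the module name, source file, and blacklist path are hypothetical, while the property names follow the Soong cc-module conventions of that era (it used the term "blacklist" at the time):

```
cc_binary {
    name: "example_daemon",   // hypothetical module
    srcs: ["daemon.cpp"],
    cflags: ["-Wall", "-Werror"],

    // Sanitizer usage is declared right on the module,
    // so it's grep-able across the whole tree.
    sanitize: {
        integer_overflow: true,
        blacklist: "sanitizer_blacklist.txt",
    },
}
```

Compared to the imperative Android.mk makefile fragments, every module's flags and sanitizer configuration sit in one declarative block, which is what makes it possible to reliably see where sanitizers and their blacklists are applied.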
The way the old system worked was still much more usable for the hardening work I do than typical build systems. It's a unified build system for the whole OS and used a declarative code style, with centralized logic I could edit and reliably make changes I couldn't elsewhere.