"Although pkg-config was a huge step forward in comparison to the chaos that had reigned previously, it retains a number of limitations. For one, it targets UNIX-like platforms and is somewhat reliant on the Filesystem Hierarchy Standard. Also, it was created at a time when autotools reigned supreme and, more particularly, when it could reasonably be assumed that everyone was using the same compiler and linker. It handles everything by direct specification of compile flags, which breaks down when multiple compilers with incompatible front-ends come into play and/or in the face of “superseded” features. (For instance, given a project consuming packages “A” and “B”, requiring C++14 and C++11, respectively, pkg-config requires the build tool to translate compile flags back into features in order to know that the consumer should not be built with -std=c++14 ... -std=c++11.)
Specification of link libraries via a combination of -L and -l flags is a problem, as it fails to ensure that consumers find the intended libraries. Not providing a full path to the library also places more work on the build tool (which must attempt to deduce full paths from the link flags) to compute appropriate dependencies in order to re-link targets when their link libraries have changed.
Last, pkg-config is not an ideal solution for large projects consisting of multiple components, as each component needs its own .pc file."
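For context, a `.pc` file is just a small text file of variables plus keyword fields; the "direct specification of compile flags" and the `-L`/`-l` pattern criticized above look roughly like this (hypothetical library, all names and paths invented):

```
# foo.pc — hypothetical pkg-config metadata for a library "foo"
prefix=/usr/local
libdir=${prefix}/lib
includedir=${prefix}/include

Name: foo
Description: Example library
Version: 1.2.3
Cflags: -I${includedir}
Libs: -L${libdir} -lfoo
```

`pkg-config --cflags --libs foo` then emits those flags verbatim for the build tool to pass to the compiler and linker.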
So going down the list:
- FHS assumptions: false: I'm doing this on NixOS, and you won't find a more FHS-hostile environment
- autotools era: awesome, software was better then
- breaks with multiple independent compiler frontends that don't treat e.g. `-isystem` sensibly? you can have more than one `.pc` file; people do it all the time. And which compilers are we talking about here, anyway? MinGW GCC from 20 years ago?
- `-std=c++11` vs. `-std=c++14`? just about every project big enough to have a GitHub repository has dramatically bigger problems than what amounts to a backwards-compatible point release from a decade ago. we had a `cc` monoculture for a long time, then diversity for a while, and now we're back to a couple of compilers that try really hard to understand one another's flags. speaking for myself, in 2025 I think it's good that `gcc` and `clang` are fairly interchangeable.
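To be fair to the quoted text, the `-std` complaint boils down to this: pkg-config concatenates flags verbatim, so a build tool that wants a single, superseding standard has to parse the flags back into features itself. A toy sketch of that deduplication (invented helper, only a handful of standards handled):

```python
# Sketch: what a build tool must do when two .pc files both carry a
# -std= flag — parse the flags, rank the standards, keep the newest.
def merge_std_flags(flag_lists):
    # rank C++ standards so the superseding one can be chosen
    rank = {"c++11": 0, "c++14": 1, "c++17": 2, "c++20": 3}
    merged, best_std = [], None
    for flags in flag_lists:
        for f in flags:
            if f.startswith("-std="):
                std = f[len("-std="):]
                if best_std is None or rank[std] > rank[best_std]:
                    best_std = std
            else:
                merged.append(f)
    if best_std is not None:
        merged.append(f"-std={best_std}")
    return merged

# pkg-config would hand the consumer both flags verbatim;
# the merge keeps only -std=c++14:
print(merge_std_flags([["-std=c++14", "-Ifoo"], ["-std=c++11", "-Ibar"]]))
# -> ['-Ifoo', '-Ibar', '-std=c++14']
```

Annoying, sure, but it's a few lines of flag parsing, not a reason to replace the whole format.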
So yeah, if this were billed as `pkg-config` extensions for embedded, or `pkg-config` extensions for MSVC, sure. But people doing non-gcc-, non-clang-compatible builds already know they're doing something different; that's the price you pay.
This is the impossible perfect being the enemy of the realistic great, with a healthy dose of "industry expertise". Just standardize some conventions on top of `pkg-config`.
The alternative to sensible builds with the working tools we already have isn't this catching on; it won't. The alternative is CMake jank in 2035, just like 2015, just like now.
edit: brought to us by Kitware, yeah, fuck that. Kitware is why we're in this fucking mess.