"...corporate development teams are uniquely qualified to utterly botch a libc, yet still push it into widespread use..."
That doesn't seem to be uniquely corporate; there are lots of botched, poorly implemented, or poorly designed community projects, not backed or funded by any corporation, that have come into widespread use.
There are also lots of attempts by corporations to force adoption of some piece of open source software, only to fail: look no further than Polymer, or NaCl/PNaCl for that matter.
Sounds to me like what they're proposing is a simplified subset of libc. Since the exact justifications for what's being supported and what's being left out haven't been discussed, it really seems premature to dismiss it out of hand. There could be plenty of valid reasons for doing it.
Look at OpenGL, a hideously complex spec that is onerous to implement with full compatibility and compliance. John Carmack forked it to create mini-GL drivers that were useful for games. Eventually this became MCD, and eventually the largest, most cash-flush vendors funded full ICD drivers. Carmack was correct to fork OpenGL for consumer hardware to support games: the full OpenGL was really incompatible with market needs, and backwards compatibility with the full spec was needed by only a tiny minority, yet it held back drivers for millions of consumers.
And now, decades after that decision, we have Vulkan, because honestly, the original OpenGL design needed to be replaced.
Perhaps libc is reaching that point where it is suffering under technical debt?
Yes. Does OpenSSL ring any bells? How about PHP and its ecosystem? How about Sendmail? Or GNOME.
There have been OSS projects 'push'ed into widespread use by the popularity of a few strong personalities and herd mentality, but have questionable design. As someone with a long history of Perl, I'd lump the entire Perl language and ecosystem into that category, along with CGI and FastCGI which existed far longer than they should have before someone put their foot down and made a better open-source application interface.
Developer adoption is by no means a process by which 'he who has the most money wins'. There are plenty of examples of big failures to thrust commercial designs into the public space, and plenty of examples of shitty OSS designs getting uptake; I'm sure if you asked Linus, he could go on a long rant.
As I mentioned with Carmack's mini-GL/MCD model, new forks often get traction if they solve very real customer problems. In Carmack's case, he was developing on NeXT hardware for PCs, and PC consumer cards didn't run OpenGL; they often ran either DirectX or a proprietary driver API. If you are in the video game market, multiplatform portability is a given, so forking OpenGL down to the high-performance subset games could actually run on consumer hardware was obvious in retrospect. It solved a real problem for game developers, who quickly latched onto it, and for 2nd- and 3rd-tier 3D accelerator vendors who couldn't get devs to use their proprietary APIs but could benefit from the easy-to-implement mini-GL.
I have no clue why they want to fork or subset libc, but before dismissing it out of hand, maybe there's a real compelling reason for it, and if there isn't, it'll have as much shelf life as proprietary Unix forks.
But using ad hominem critique in the absence of a concrete proposal seems out of place to me.
It's not really ad hominem. Android itself is a massive pile of hacks (the zygote process, Project Treble, Binder, etc.) that, while it worked for Google in fixing their issues, is far from ideal. Also, what API has replaced FCGI and CGI in general? Hopefully not in-process modules.
What's replaced it is running services or application servers that contain an embedded HTTP handler, with a load balancer in front of them.
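To make the contrast concrete, here's a minimal sketch of the embedded-handler model in plain C over POSIX sockets. The port and response are arbitrary choices for illustration; a real service would use an HTTP library and sit behind a load balancer like nginx or HAProxy:

    /* Sketch of the embedded-handler model: the application itself
     * speaks HTTP on a long-lived socket, rather than being spawned
     * per request the way CGI processes were. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int main(void) {
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(8080);          /* arbitrary port */

        int on = 1;
        setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &on, sizeof on);
        if (bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("bind");
            return 1;
        }
        listen(srv, 16);

        for (;;) {
            int conn = accept(srv, NULL, NULL);
            if (conn < 0)
                continue;
            char buf[4096];
            read(conn, buf, sizeof buf);      /* drain request; ignored here */
            const char *resp =
                "HTTP/1.1 200 OK\r\n"
                "Content-Type: text/plain\r\n"
                "Content-Length: 13\r\n"
                "\r\n"
                "Hello, world\n";
            write(conn, resp, strlen(resp));
            close(conn);
        }
    }

Processes like this scale horizontally behind the load balancer, which takes over the fan-out role that per-request CGI process spawning used to play.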
Also, rather than guilt by association (Android et al. were built under different circumstances than the hundreds of other libraries Google has released), why not actually hear out the concrete proposal and then criticize it?
Saying no one in category X can ever build anything good is just illogical.
Have you used OpenSSL? It sucks. The API returns strange and inconsistent error codes, error reporting is terrible, and strange non-orthogonalities abound. It is a lesson in how not to design an API. The only reason it got traction is that it was first and no one bothered to do better.
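To make that concrete, here is roughly the dance a careful caller of SSL_read has to perform (a sketch; the retry/close policy is simplified):

    /* The return value of SSL_read alone is not enough: you must then
     * consult SSL_get_error(), and for some codes also errno and a
     * separate thread-local error queue drained via ERR_get_error(). */
    #include <errno.h>
    #include <stdio.h>
    #include <openssl/ssl.h>
    #include <openssl/err.h>

    int read_some(SSL *ssl, char *buf, int len) {
        int n = SSL_read(ssl, buf, len);
        if (n > 0)
            return n;                   /* the one unambiguous case */

        /* Must be called before anything else touches the error state. */
        switch (SSL_get_error(ssl, n)) {
        case SSL_ERROR_WANT_READ:
        case SSL_ERROR_WANT_WRITE:
            return 0;                   /* not an error: retry later */
        case SSL_ERROR_ZERO_RETURN:
            return -1;                  /* clean TLS shutdown */
        case SSL_ERROR_SYSCALL:
            /* Now the answer lives in errno -- and n == 0 here has
             * historically meant an unexpected EOF instead. */
            perror("SSL_read");
            return -1;
        default: {
            /* Everything else sits in yet another place: a thread-local
             * queue, popped one code at a time. */
            unsigned long e;
            while ((e = ERR_get_error()) != 0)
                fprintf(stderr, "%s\n", ERR_error_string(e, NULL));
            return -1;
        }
        }
    }

Three different error channels (return value, errno, error queue) for one read call is exactly the kind of non-orthogonality being complained about.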
They are great examples of poorly designed replacements for existing open source software being pushed into widespread adoption solely because the largest enterprise distribution of Linux willed it.
And also great examples of software developers lying about their intentions early on to silence criticism of their design and the subsequent ramifications of adopting their solutions.
Multiply that by 100 when comparing Google's influence with Red Hat's.
If Google wants to develop their own libc, they can. If they want to make it open source, they can. There's nothing about doing those things that necessitates doing so as part of the LLVM project.
Seems to me that trying to upstream is common courtesy before forking. If there is some reason it needs to be integrated with changes to LLVM itself, then you might otherwise end up with an LLVM fork similar to WebKit/Blink; is that a better outcome?
The point is that no existing libc is part of the LLVM project, i.e., under that umbrella. What's different about Google wanting their own libc, versus the precedent Clang and musl currently set?
The question I have is: is there a legitimate reason for tight integration between a compiler and libc? Perhaps there are very good reasons for them to be designed together. A lot of compilers do this for other languages, because it allows them to have special logic around intrinsics, for example.
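C compilers already special-case libc functions as builtins, so the integration is real even today. A small, standard-C illustration (nothing Google-specific is assumed here):

    /* With a known, fixed size, Clang and GCC will typically lower
     * this memcpy to a single 8-byte load/store pair instead of a
     * library call (compare the output with and without -fno-builtin). */
    #include <stdint.h>
    #include <string.h>

    uint64_t load_u64(const unsigned char *p) {
        uint64_t v;
        memcpy(&v, p, sizeof v);   /* compiled as an inline copy, not a call */
        return v;
    }

A libc designed alongside the compiler can make such contracts explicit (which functions are safe to expand inline, how errno behaves, aliasing guarantees) instead of leaving the compiler to assume them about a third-party implementation.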