JS dependencies should be pretty small compared to images or other resources. HTTP pipelining should make it fast to load them from your server with the rest.
The only advantage to using one of those CDN-hosted versions is that it might help with browser caching.
> HTTP pipelining should make it fast to load them from your server with the rest
That's true, but it should be emphasized that it's only fast if you bundle your dependencies, too.
Browsers and web developers haven't been able to find a way to eliminate a ~1ms/request penalty for each JS file, even if the files are coming out of the local cache.
If you're making five requests, that's fine, but if you're making even 100 requests for 10 dependencies and their dependencies, there's a ~100ms incentive to at least produce a bundle that concatenates your JS.
And once you've added a bundling step, you're a few minutes away from adding a bundler that also minifies, which often saves 30% or more, usually far more than the concatenation alone saved you.
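The whole bundle-and-minify step can be a handful of lines. A minimal sketch using esbuild (the choice of bundler and the file names are just for illustration; any bundler with a minify option does the same job):

    // Illustrative build script: bundle the dependency graph and minify it.
    const esbuild = require('esbuild');

    esbuild.build({
      entryPoints: ['src/main.js'], // hypothetical entry module; its imports get followed
      bundle: true,                 // concatenate everything it imports into one file
      minify: true,                 // often 30%+ smaller on top of the concatenation
      outfile: 'dist/app.min.js',
    }).catch(() => process.exit(1));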
> The only advantage to using one of those CDN-hosted versions is that it might help with browser caching
And that is not true. Browsers have separate caches for separate sites for privacy reasons. (Before that, sites could track you from site to site by seeing how long it took to load certain files from your cache, even if you'd disabled cookies and other tracking.)
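Conceptually (this is only a sketch, not real browser internals), the cache lookup now includes the top-level site as well as the URL, so the same CDN file fetched from two different sites is cached twice:

    // Conceptual sketch only: a partitioned cache key.
    const cacheKey = (topLevelSite, resourceUrl) => `${topLevelSite} ${resourceUrl}`;

    cacheKey('https://site-a.example', 'https://cdn.example/lib.js');
    // is a different cache entry from:
    cacheKey('https://site-b.example', 'https://cdn.example/lib.js');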
There is still a caching effect of the CDN for your servers, even if there isn't for the end user: if the CDN serves the file then your server does not have to.
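For that to help, the origin has to mark its static files as cacheable in the first place. Something like this (plain Node; the paths and max-age are just an example policy) lets a CDN or any other cache keep serving those files without touching your server again until they expire:

    // Illustrative policy: long-lived caching for static assets, none for pages.
    const http = require('http');

    http.createServer((req, res) => {
      if (req.url.startsWith('/static/')) {
        // Fingerprinted static files: let CDNs and browsers keep them for a year.
        res.setHeader('Cache-Control', 'public, max-age=31536000, immutable');
      } else {
        // Dynamic content: always come back to the origin.
        res.setHeader('Cache-Control', 'no-cache');
      }
      res.end('...');
    }).listen(8080);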
Large CDNs with endpoints in multiple international locations also reduce latency: if your static content comes from the PoP closest to me (likely London, <20ms away from where I'm currently sat, ~13 on FTTC at home⁰, ~10 at work), that can be quite a saving if your server is otherwise hundreds of ms away (~300ms for Tokyo, ~150 for LA, ~80 for New York). Unless your caching is set to be very aggressive, dynamic content still needs to come from your server, but even then a high-tech CDN can² reduce the latency of the TCP connection handshake and TLS handshake¹ by reusing an already-open connection between the CDN and the backing server(s) to pipeline new requests.
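As a rough back-of-the-envelope illustration (assuming TLS 1.3, so one round trip for TCP and one for TLS, and ignoring DNS):

    Cold HTTPS connection straight to a ~300ms-away origin:
      TCP (1 RTT) + TLS (1 RTT) + request (1 RTT) ~= 3 x 300ms ~= 900ms to first byte
    Via a ~20ms-away PoP holding a warm connection to that origin:
      ~3 x 20ms to reach the PoP + ~300ms for one origin round trip ~= 360ms
    If the PoP already has the file cached, the origin drops out entirely: ~60ms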
This may not matter at all for many well-designed sites, or for sites where latency matters little enough that a few hundred ms here or there won't really bother the user, but it can be a significant benefit to many bad setups and even a few well-designed ones.
--------
[0] York. The real one. The best one. The one with history and culture. None of that “New” York rebranded New Amsterdam nonsense!
[1] if using HTTPS and you trust the CDN to re-encrypt, or HTTP and have the CDN add HTTPS; neither of which I would recommend, as it is exactly an MitM situation, but both are often done
[2] assuming the CDN also manages your DNS for the whole site, or just a subdomain for the static resources, so the end user sees the benefit of the CDN's anycast DNS arrangement.