I just looked into Search Console for the first time in forever, and I'm not sure how Core Web Vitals would be pushing for unnecessary JavaScript. A website consisting of 100% static content seems to be fully okay according to their metrics - but funnily enough, some months ago they claimed that 50% of the pages were served too slowly on computers, while 100% of pages were OK on mobile. Same static server. Now it's the other way around: they say a random selection of pages is too slow on mobile, and the selection changes every day. Maybe it's just their own connection that sucks? It also doesn't help that they keep using acronyms for the broken stuff that aren't explained anywhere on the page itself.
> Maybe it's just their own connection that sucks?
Your Core Web Vitals report in Search Console is based on Chrome User Experience Report data, meaning it comes from your real users, not from Google running simulated tests of your pages on its own servers. I.e. when someone loads your page in Chrome, Chrome reports back how long the page actually took to load for that user (this doesn't happen for all users; they have to meet various opt-in criteria [1]). So if you see that 50% of pages are served too slowly on computers, it means that 50% of your real users actually experienced slow page loads (as measured by the Web Vitals metrics). Perhaps your static site isn't as efficient as you think, or your server is slow, or your users' devices and connections are much worse than you assumed. That's the power of this data: it shows you that the real-world experience isn't as great as you're assuming and encourages you to investigate further.
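If you want to see the same numbers your users are generating, the usual approach is the web-vitals library, which wraps the browser's performance APIs and fires a callback as each metric is measured. A minimal sketch - the /analytics endpoint is a hypothetical stand-in for wherever you'd send the data:

    // Collect the Core Web Vitals from a real user session and beacon
    // them to your own endpoint - the same kind of field data CrUX
    // aggregates from opted-in Chrome users.
    import { onCLS, onINP, onLCP, type Metric } from 'web-vitals';

    function report(metric: Metric): void {
      const body = JSON.stringify({
        name: metric.name,     // 'CLS' | 'INP' | 'LCP'
        value: metric.value,   // the measured value (ms, or unitless for CLS)
        rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
      });
      // sendBeacon survives page unload; fall back to fetch if it refuses.
      if (!navigator.sendBeacon('/analytics', body)) {
        fetch('/analytics', { method: 'POST', body, keepalive: true });
      }
    }

    onCLS(report);
    onINP(report);
    onLCP(report);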
(For the record) The landing page of the Core Web Vitals report does indicate where the data is coming from. Next to "Source: Chrome UX report" there is a question mark; if you hover over it and then click the "Learn more" link, it takes you to this page: https://support.google.com/webmasters/answer/9205520?ref_top...
Disclosure: Googler working on https://web.dev. I'm not on the Web Vitals team but interact with them.
It's how the combination of the three metrics pushes you to behave if you want a site with rich content. You're supposed to paint the page ASAP, so you don't want to defer loading any large content; but you're also not supposed to have the page layout shift around at all as you dynamically load the rest later, so you've got to do clever things with placeholders and swapping out content and whatnot. You've got up to 4 seconds to load all that stuff, which is enough to load an enormous amount of data over a fast internet connection - so much that the same amount of content might take minutes over a slower connection. Fortunately, they've chosen measurement methods for that metric that are heavily biased toward the experience of people with 24/7 access to broadband.
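For what it's worth, the placeholder dance usually boils down to reserving the final dimensions up front so the late-arriving content can't move anything. A rough sketch of that pattern (the selector and image URL are made up):

    // Reserve space first, load the heavy content later: the container
    // already has explicit dimensions (or an aspect-ratio) in CSS, so
    // swapping the real content in afterwards causes zero layout shift,
    // while the initial paint stays cheap.
    function hydratePlaceholder(selector: string, src: string): void {
      const box = document.querySelector<HTMLElement>(selector);
      if (!box) return;
      const img = new Image();
      img.src = src;
      img.decode()                  // fetch + decode off the critical path
        .then(() => box.replaceChildren(img))
        .catch(() => { /* keep the placeholder if the load fails */ });
    }

    // Defer the swap until the browser is idle, well after first paint.
    requestIdleCallback(() => hydratePlaceholder('#hero', '/big-hero.avif'));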
So, yeah, Google may want to encourage a nice Web experience, but they don't want to back this with metrics that might discourage people from sending too much business in AdSense's direction, or fail to favor Chrome over alternative browsers.
Whilst there is an element of lab measurement involved, they do use field measurement, so metrics are collated from real users rather than from Google's own connection. This means your data could just as easily be skewed by a browser/OS update that rolls out to a ton of devices at once as by anything at your end.
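The field numbers aren't magic, either - they come from the same performance APIs any page can observe for itself. Something like this (logging only, for illustration) shows the raw LCP entries Chrome ends up aggregating:

    // Watch the browser's own largest-contentful-paint entries; opted-in
    // Chrome instances report essentially this timing to the Chrome UX
    // Report. `buffered: true` replays entries from before the observer
    // was registered.
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        // Each entry is a new "largest element so far"; the last one
        // before user input is the final LCP.
        console.log('LCP candidate at', entry.startTime, 'ms');
      }
    }).observe({ type: 'largest-contentful-paint', buffered: true });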
Do agree that the proliferation of acronyms doesn't help with wrapping your head around it all!
Perhaps the loading-time tolerances are more lenient on mobile, meaning that even if the timings are largely the same between desktop and mobile, the two could still get different results.
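Mechanically that would just be different cutoffs in the rating step. A toy sketch using the published LCP thresholds (2.5 s for "good", over 4 s for "poor"); a laxer mobile tolerance would simply mean larger numbers here:

    // How a raw LCP timing maps to the published rating buckets.
    type Rating = 'good' | 'needs-improvement' | 'poor';

    function rateLCP(ms: number): Rating {
      if (ms <= 2500) return 'good';
      if (ms <= 4000) return 'needs-improvement';
      return 'poor';
    }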