> The web is supposed to degrade gracefully if you are missing browser features, up to and including images turned off.
I agree. (This should also include CSS, TLS, cookies, and many other things; I often disable CSS, and a page should work just as well with CSS disabled as it does with pictures, JavaScript, or cookies disabled.)
However, there are some uses where JavaScript may be helpful, e.g. if a web page has calculations or something like that implemented in JavaScript; but that is not an excuse to prevent the documentation from being displayed when JavaScript is disabled. Documentation, and as much other content as possible, should still work even if JavaScript is disabled (and, depending on what the script does, the page might provide a description of the calculation, the rules of the game being implemented, a link to API documentation, or something else like that).
Pictures can also be useful in some articles (though they are often overused); but even then, if a picture is not displayed inline, you could use an external program to display it. However, if something can be explained in the text, then it should be explained in the text if possible, so that it still works even if you do not have a suitable program to display the picture (or do not want to display it, e.g. because the file size is too big, or because you are using text-to-speech or a braille display or something else like that).
TLS should not always be mandatory, either. For things that require user authentication, and for writing, making it mandatory can be useful (especially if you are using X.509 client authentication, which is better than cookies or other methods of authentication anyway); but for read-only access to public data, TLS should be optional (though the server should still offer it in case the client wants to use TLS for read-only access to public data too).
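As a rough sketch of what I mean by client-certificate authentication for writes (Python; the file names are placeholders), the server-side TLS context simply refuses clients that do not present a certificate:

```python
import ssl

# Placeholder file names; the point is only that writes can be authenticated
# by an X.509 client certificate instead of cookies.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain(certfile="server.pem", keyfile="server.key")
ctx.load_verify_locations(cafile="trusted-clients.pem")  # client certs (or CA) we accept
ctx.verify_mode = ssl.CERT_REQUIRED  # reject any client that does not present a certificate
```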
No, TLS should always be required. We should be moving away from plaintext anything on the wire, if for no other reason than privacy (but also for integrity: injecting scripts into an HTTP response is a nasty attack surface).
It would have to be some kind of web of trust thing. But no, I don't have any specific suggestions to that effect (which is why I believe that HTTP should remain an option).
A lot of people argue about this both for and against.
However, TLS does not entirely solve all of these problems, especially the way it is commonly used with HTTPS (it can be used more securely, but usually isn't). Although it prevents spies from injecting scripts, it does not prevent the server operator from serving scripts different from what the user expects (so it is not necessarily the software the user intends to run), nor does it help against someone taking over the domain name and then legitimately putting something else there instead (and stealing cookies, etc.; X.509 client authentication would at least prevent theft of authentication). To verify that a file has not been modified, cryptographic hashes help much more, whether or not TLS is used (TLS still prevents spies from seeing which files you are looking for, and from obtaining a copy of a file that is intended to be secret). However, these are not inherent problems with TLS; they are problems with the implementations, with HTML, with HTTPS, and with the expectations.
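For example, checking a downloaded file against a published hash is simple and works the same whether the file was fetched over HTTP or HTTPS (a rough sketch in Python; the expected hash here is a placeholder):

```python
import hashlib

# Placeholder value; in practice the expected hash would come from a trusted
# source (signed release notes, a web-of-trust entry, etc.), not the same server.
EXPECTED_SHA256 = "0" * 64

def verify_download(path: str) -> bool:
    """Return True if the file's SHA-256 matches the published hash."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == EXPECTED_SHA256
```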
Another issue with TLS is proxies: if you deliberately run a proxy on your own computer (where the connection to it is already protected from spies, etc.), you still have to decrypt and re-encrypt the data twice. This too is a problem with implementations, not an inherent problem with TLS itself; an implementation could easily allow an unencrypted local proxy for encrypted connections, but typically doesn't.
Self-signed certificates are commonly used with the Gemini protocol and can be used with other "small web" protocols that support TLS. If you know the server's certificate from some other source (TOFU is a common way, but not the only one), then a self-signed certificate will work, although it should be possible for the end user to require a specific server certificate (and being able to put this information into the URL would let you pass around a URL that pins the expected certificate, which may sometimes be helpful).
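As a rough sketch of TOFU (Python; the pin-file path is a placeholder, and 1965 is Gemini's usual port), the client remembers the certificate fingerprint on first contact and refuses the connection if it later changes:

```python
import hashlib, json, os, socket, ssl

PIN_FILE = os.path.expanduser("~/.cert-pins.json")  # placeholder local pin store

def load_pins():
    try:
        with open(PIN_FILE) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

def tofu_connect(host, port=1965):
    """Connect with TLS, accept any certificate, then check it against the
    fingerprint remembered from the first connection (trust on first use)."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE      # we do our own pinning instead of CA checks
    pins = load_pins()
    sock = socket.create_connection((host, port))
    conn = ctx.wrap_socket(sock, server_hostname=host)
    der = conn.getpeercert(binary_form=True)
    fp = hashlib.sha256(der).hexdigest()
    key = f"{host}:{port}"
    if key not in pins:
        pins[key] = fp                   # first use: remember this certificate
        with open(PIN_FILE, "w") as f:
            json.dump(pins, f)
    elif pins[key] != fp:
        conn.close()
        raise ssl.SSLError(f"certificate for {key} changed; expected {pins[key]}, got {fp}")
    return conn
```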
Web of trust for X.509 certificates is something that I had thought of, and I had some ideas for a file format for that purpose. It would be a DER file with a digital signature, specifying hashes (and possibly other details) of certificates that you are aware of, together with details about which parts of each certificate you trust and in what ways (e.g. you can say that you trust the common name, or that you understand a specific extension but have been unable to verify it, etc.), as well as optional comments. Note that this can be used whether or not the certificate is self-signed; certificate authorities can still be used too (they still have uses, e.g. when the authority is partially delegated).
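I have not fixed the exact DER layout, but roughly the kind of information each entry would carry is something like the following (sketched as Python dataclasses rather than ASN.1; all the names are made up):

```python
from dataclasses import dataclass, field
from enum import Enum

# Hypothetical structures only; the actual DER encoding is not specified here.
class TrustLevel(Enum):
    TRUSTED = 1                 # field verified and trusted
    UNDERSTOOD_UNVERIFIED = 2   # extension understood but could not be verified
    UNTRUSTED = 3

@dataclass
class FieldAssertion:
    field_name: str             # e.g. "commonName" or an extension OID
    level: TrustLevel
    comment: str = ""           # optional free-form comment

@dataclass
class CertificateAssertion:
    cert_sha256: bytes                               # hash identifying the certificate
    assertions: list[FieldAssertion] = field(default_factory=list)

@dataclass
class TrustFile:
    entries: list[CertificateAssertion]
    signature: bytes            # signature by the author of this trust file
```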
Another problem with certificates is securely superseding them (especially self-signed certificates). X.509 does not allow adding an extra field of unsigned extensions after the signature (and an extension inside the certificate clearly cannot sign the certificate itself, since that would interfere with the signature; an extra field could be appended, but implementations that do not understand it may reject it), so this has to be done in other ways. One is to add an extension to the certificate specifying a superseding key (for security, this can be different from the certificate's own key, and the corresponding private key may be stored on a separate computer that is not connected to the internet, and may also be password-protected for additional security), and optionally links to sources of the superseding file; the superseding file is then a DER file that lists the certificates being superseded and what they are superseded by. This means that even if the keys (or other details in the certificate, such as names or expiry dates) change, you do not need to trust anyone else, and no third-party authority is needed to verify whether it is actually the same person/organization, or someone else who has taken over the domain name, or spies tampering with the certificate, etc.
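Roughly, verifying a superseding file would then look something like this (a sketch using the Python "cryptography" library; the OID is a made-up placeholder, and I assume for simplicity that the extension value is just a raw Ed25519 public key):

```python
from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

SUPERSEDE_OID = x509.ObjectIdentifier("1.3.6.1.4.1.99999.1.1")  # placeholder private OID

def supersession_key(old_cert: x509.Certificate) -> Ed25519PublicKey:
    """Read the superseding key named in the old certificate's extension."""
    ext = old_cert.extensions.get_extension_for_oid(SUPERSEDE_OID)
    return Ed25519PublicKey.from_public_bytes(ext.value.value)  # assumed raw 32-byte key

def is_valid_supersession(old_cert: x509.Certificate,
                          record: bytes, signature: bytes) -> bool:
    """record is the DER body listing superseded certificates and their
    replacements; signature must be made with the key named in the old cert."""
    try:
        supersession_key(old_cert).verify(signature, record)
        return True
    except InvalidSignature:
        return False
```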
Web of trust can be used together with superseding certificates.
DANE would also be a good thing to use, although it won't help by itself, for several reasons: for example, if the domain name is taken over by someone else, or if the DNS is not itself secure (or whoever runs the DNS tampers with it). However, it can be combined with other methods to improve security.
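For example, a DANE TLSA record of the common "3 1 1" form (DANE-EE, SubjectPublicKeyInfo, SHA-256) just publishes a hash of the server's public key in DNS; computing it is simple (a sketch with the Python "cryptography" library; the domain in the comment is a placeholder):

```python
import hashlib
from cryptography import x509
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def tlsa_3_1_1(pem_bytes: bytes) -> str:
    """Certificate association data for a "3 1 1" TLSA record:
    SHA-256 of the DER-encoded SubjectPublicKeyInfo."""
    cert = x509.load_pem_x509_certificate(pem_bytes)
    spki = cert.public_key().public_bytes(Encoding.DER, PublicFormat.SubjectPublicKeyInfo)
    return hashlib.sha256(spki).hexdigest()

# Published in the DNS zone as, for example:
#   _443._tcp.example.com. IN TLSA 3 1 1 <output of tlsa_3_1_1>
```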
However, I think that allowing unencrypted connections is simpler anyway. If the user does not need or want this privacy and security enough to spend the extra computing power, or wants to use older software that does not support the current version of TLS, or is connecting only to other programs on the same computer (which has its own security mechanisms, making TLS unnecessary except for testing purposes), or for whatever other reason, then allowing unencrypted connections is helpful, especially for read-only public data. Integrity can often be verified better in ways other than TLS anyway, as I mentioned. I think there are many reasons why unencrypted connections should remain an option where possible; however, servers and clients should also support TLS where possible, in case you do want this security, but it should not normally be mandatory (especially for read-only public data).