"If your main concern is old computers the solution is to use a proxy that strips the encryption."
Why not also use the proxy to add encryption when it is missing. Then "HTTPS-only" is not necessary. The decision of which scheme to use, http:// or https://, rests with the user, not the server.
Encrypting data after it has transited untrustworthy networks (which could have surveilled or modified it before it gets to you) is about as useful as closing the barn door after the horse has already escaped. The encryption (and authentication) needs to happen at the origin to get any security benefit.
I think you misunderstood. A localhost-bound forward proxy on the client side encrypts the traffic. For example, haproxy can be used for this purpose. It detects the presence/absence of SSL on connection and if absent it adds encryption before the traffic enters the network. Sorry I should have been more specific as "proxy" is a loaded term.
I must still not understand what you mean. Where is the encryption being added? Could you draw me an ASCII network diagram (showing the server, the browser, and the intermediate network hops) with an indication?
It is being added by the proxy server listening on the loopback which connects to the remote website.
Browser connects to forward proxy on port 80, forward proxy (compiled with SSL library) connects to target IP on port 443.
This is how one can, e.g., use clients that are not SSL-enabled to access websites that require SSL.
For example, if the forward proxy is listening on 127.0.0.1:80, we can make an encrypted connection to example.com using the original *Hobbit* netcat, which does not support SSL.
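As a concrete illustration, a minimal haproxy configuration for such a loopback TLS-adding hop might look like the following sketch (the hostname, CA-bundle path, and names are examples, not taken from the thread):

```
frontend local_http
    mode http
    # Plain-HTTP listener on the loopback, for clients with no SSL support
    bind 127.0.0.1:80
    default_backend example_tls

backend example_tls
    mode http
    # Rewrite the Host header for the upstream site
    http-request set-header Host example.com
    # Re-encrypt: connect to the remote site over TLS and verify its certificate
    server upstream example.com:443 ssl verify required ca-file /etc/ssl/certs/ca-certificates.crt sni str(example.com)
```

With something like this running, even a client with no TLS support can reach the HTTPS site, e.g. `printf 'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n' | nc 127.0.0.1 80`.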
OK, now I understand what you meant about "a forward proxy on the client side" (as that's exactly what I mean by "use a proxy to strip the encryption"). But I still don't understand why that allows you to not have to use HTTPS-only on the originating server to get the benefits of HTTPS-only?
Because I the user am running a forward proxy to encrypt all outgoing HTTP requests, I do not have to rely on "HTTPS-only" on the server side. I enforce "HTTP-everywhere" on the client side. That's the theory anyway.
To be honest, there are still some sites that do not, and probably never will, offer HTTPS, and I have to account for those in the proxy setup. For these websites I might assign a different local IP that does not add encryption.
In running this setup, there are times when I find that, for one reason or another, "HTTPS-only" on the server side has failed to catch every instance where http:// should be https://. I use many different clients, the least of which is the modern browser, which may have some whizbang features that try to enforce "HTTPS-everywhere". The clients I use more are simpler and less complex and do not have such features. Instead of relying on the modern browser, I rely on an extensive proxy configuration to make sure everything gets encrypted (when appropriate).
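Mechanically, such a TLS-adding loopback hop is just a byte relay that wraps the upstream side of the connection in TLS. A minimal Python sketch of the idea (the hostname and listen port are illustrative assumptions; a real setup would use haproxy or similar, as above):

```python
import socket
import ssl
import threading

TARGET_HOST = "example.com"      # assumption: the HTTPS site to reach
LISTEN = ("127.0.0.1", 8080)     # assumption: local plain-HTTP listener

def relay(src, dst):
    """Copy bytes from src to dst until EOF, then shut both sockets down."""
    try:
        while (data := src.recv(4096)):
            dst.sendall(data)
    except OSError:
        pass
    finally:
        for s in (src, dst):
            try:
                s.shutdown(socket.SHUT_RDWR)
            except OSError:
                pass

def handle(client):
    """Wrap the upstream leg in TLS (with certificate and hostname
    verification, via the default context) and relay both directions."""
    ctx = ssl.create_default_context()
    upstream = ctx.wrap_socket(
        socket.create_connection((TARGET_HOST, 443)),
        server_hostname=TARGET_HOST,
    )
    threading.Thread(target=relay, args=(upstream, client), daemon=True).start()
    relay(client, upstream)

def serve():
    """Accept plain-HTTP clients on the loopback and handle each in a thread."""
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(LISTEN)
    srv.listen()
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()
```

Note this sketch does no Host-header rewriting or per-site routing; it only shows where the encryption is added, namely on the client machine before traffic enters the network.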
Why 1.3? Most sites today are using TLSv1.2 at best, not 1.3.
One argument that it is/was "security theater"[1] is that there were/are numerous programs that "added support for SSL/TLS" under pressure to conform but did not bother to add authentication measures such as hostname checking and validation, which was not even a function of the OpenSSL library until 2015.
1. Adopting SSL/TLS for accessibility reasons instead of security reasons.
Yeah but there's nothing that says the Xcode experience has to be so painful[1] or that people have to pay Apple or that software has to be approved by Apple in order to preserve this flexibility. Purchasers should have options, including the option to write software (for themselves) that works on the computer they paid for but might not work if Apple introduces computers based on a new architecture, which the existing Apple customer may or may not choose to buy in the future.
1. Painful for the parent commenter. It is painful for me, too. It might, or might not, be painful for others.
I haven't used Xcode much, so it's not for me to say whether it's good or bad. However, I would assume that Apple uses Xcode internally; if it were a pain to use, wouldn't Apple fix it, if only to save on internal development costs?
Of course, it could be that Xcode is very finely tuned to Apple's needs, which may be radically different from those of third-party developers.
They do use it internally, and a lot of teams do complain about it. I hope there is some real work going on to improve parts of it (how about shipping patches rather than an entire 11 GB binary every version?). I think a lot of devs also pin certain issues on Xcode that are actually issues with the Swift compiler (slow or missing autocomplete, unhelpful error messages, or slow compilation times, for example). ObjC is far, far quicker to compile.
Apple uses Xcode internally very heavily, and it's less about it being tuned to Apple's needs (which it is, but as an app developer your needs and Apple's match pretty closely) but that Xcode tends to not scale very well.
Xcode (with Swift) seems to crash on me regularly just with normal usage. The last time I used it, it would occasionally lag so badly that it lost characters I was typing, and my code would come out completely garbled (what!?). A few weeks ago I tried to do a local build of Signal's iOS app and the Swift compiler errored out on some random function: apparently it spent so long doing type inference that it decided to just generate a compilation error instead. I guess I'm not wealthy enough to even compile Swift code slowly. (For reference, this is on a $3000 (AUD) MacBook Pro from 2016 with 16 GB of RAM, the latest Xcode, and nothing else open.)
Back in the Objective-C days I used to really like Xcode and loved Interface Builder. The documentation and ecosystem seemed top-notch. But in the last few years it's gotten much, much worse. It's like the old guard have left and a young generation have come in to develop the Swift ecosystem, and they just don't have good judgement yet. The documentation is half-baked at best. The whole environment is an overcomplicated, buggy, laggy mess. It's a pity too, because I think Swift is a lovely language and I love what they're doing with SwiftUI.
In comparison, I find Microsoft's tooling around their languages to be excellent. Visual Studio and VS Code are fast, stable, feature-rich, and responsive, particularly with Microsoft's own languages: C# and TypeScript.
Xcode with Swift is a bit odd in that its performance varies greatly depending on project structure and as nonintuitive as it may be, the way the developer writes code.
For instance a sprawling project with an even split of Objective-C and Swift that constantly call into each other can bring SourceKit to its knees, as can some CocoaPods setups. Going Swift-only or even just reducing the surface between Objective-C and Swift to the absolute bare minimum can speed things up a lot.
As for writing style, there are a few things that can trip SourceKit up, but I've found that the main things to avoid are chaining optionals too deeply and nesting closures too deeply (I recommend keeping both to 2-3 levels, tops), as well as casting to/from Any too often.
Not quite - the origin of "You're holding it wrong" was a response to someone who was quite literally trying to illustrate the cellular signal attenuation issue by pressing down with their fingers on the gap between the two antennas.
I believe the parent was saying that the performance of Xcode varies based on the project. This is true: breaking projects up into more modular setups has long optimized project build speeds. The quality you get out of your tools is in many cases a function of how you use them.
"You're holding it wrong" would apply if you were deliberately using the tools in a sub-optimal way to try to prove a point.
Yeah, it's 2021. There's no excuse for shipping software you know crashes every few hours of normal use, or lags so badly you can't type sometimes. I doubt Xcode in its current form would make it through Apple's own App Store review process. Fix your software or replace your staff with people who can.
I choose to abandon programming for Apple's devices entirely rather than put up with Xcode.
I've noticed similar issues; it's especially a problem with SwiftUI, as nesting closures often feels natural there. My brand-new M1 mini will still give me the "expression too complex" error on larger SwiftUI components.
Would the hobbyist projects whose development is effectively inhibited by Xcode and the Apple Developer Program, i.e., the hoops Apple makes programmers jump through, be more interesting than the apps people pay developers to create.
"Page loading" is a lie. The truth is that artificial delay has been intentionally inserted. You are waiting for ad auctions, e.g., header bidding, to conclude. About 1s.
"For years, they said no, because they were worried about the liability of accidentally blocking something that wasn't a phishing site."
Can anyone explain how a web browser author could be liable for using a blacklist. Once past the disclaimer in uppercase that precedes every software install, and past the Public Suffix (white)list that browsers include, how do you successfully sue the author of a software program, a web browser, for having a domain-name blacklist. Spamhaus was once ordered to pay $11 million for blacklisting some spammers, but that did not involve a contractual relationship, e.g., a software license, between the spammers and Spamhaus.
I think the situation is actually exactly like the Spamhaus case you describe: it wouldn't be the browser user that sues, but the blocked website's owner. The website's owner need not have accepted any kind of agreement from the browser maker in order to be harmed by the blocklist.
Perhaps the website would sue the author of the list.
That does not explain why this comment suggests a browser author was afraid to use the list.
The browser author could easily require the list author to agree that the browser author has no obligations to the list author if the list author gets sued by a website, and that the list author must indemnify the browser author if the browser author is named in any suit over the list. The list author must assume all the risk.
"In the case of Munn v. Illinois in 1876, the Supreme Court upheld the right of government to regulate industries that affect the common good or public interest (in this case, private grain storage facilities that all local farmers depended upon). Drawing on centuries of British common law precedents that justified regulations of privately owned ferries, wharves, and the like, the majority of the court declared that "when private property is devoted to a public use, it is subject to public regulation.""
Is the "private property devoted to public use" a million-page website consisting of 100% UGC.
Is it the telco infrastructure that provides the backbone of the internet.
Do we the public really need the help of middlemen or do we only need the backbone infrastructure.
Although 100% UGC, some consider the million-page websites "private".
Well, now we have politicians and other characters, who are using these million-page private website middlemen as agents in their efforts to persuade voters to vote them in, re-elect them, raid a capitol, and so on.
The middlemen are now caught in the middle of a political dogfight. Perhaps that is the price of being a middleman.
Interesting that he has set the threshold at 1 million users. How many websites are over that threshold. Would this incentivise more small websites. Are small websites better than large ones.
They are considering something similar (or not) in Poland: