> decades of careful documented research and reproducible results, is magical thinking and not science is... hallucinatory.
Right, the same careful research that helps me characterize personality disorders in my day job. You really should stop doing hobbyist research, because The Psychopath Code proves how badly it can end.
It would be really useful to have an actual concrete argument about how and why, rather than vague "your amateurish book proves how bad your book is" statements.
Edit: ah, checked your comment history; you've been trolling HN for a while. Enjoy yourself, it's a weird hobby, but hey. Some people like to cook and make music, and other people... they troll.
Cancer is a horrible disease that I would not wish on anyone, but getting cancer is not a judgement. It won't make you a saint. Nor the reverse. The inevitably flawed person is going to be much the same as they were before the diagnosis.
I don't know where you got the information that they do not log these requests; assuming that they do log is a good assumption, not a bad one. It would be atypical not to log every HTTPS request.
A lot of setups have one machine doing the SSL termination and then forwarding the requests over plain HTTP to backend servers, which log the requests and include the GET parameters in the log file.
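To make that concrete, a minimal sketch (my own toy example, not any particular real setup): a plain-HTTP backend sitting behind a hypothetical SSL-terminating proxy. Python's stock http.server logs the full request line by default, so anything in the GET parameters lands in the backend's log even though the client spoke HTTPS to the proxy.

    # Toy backend behind an SSL-terminating proxy (proxy not shown).
    # BaseHTTPRequestHandler logs the full request line by default,
    # e.g. "GET /login?token=secret HTTP/1.1", so the GET parameters
    # end up in the backend's log even though the client used HTTPS.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # self.path still contains the query string at this point
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok\n")

    HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()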
So the idea is to always send the interpreter, along with the data? They should always travel together?
Interesting. But, practically, the interpreter would need to be written in such a way that it works on all target systems. The world isn't set up for that, although it should be.
Hm, I now realize your point about HTML being idiotic. It should be a description, along with instructions for parsing and displaying it (?)
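To check my own understanding, a toy sketch of what I think you mean (purely my illustration, all names made up): the message carries both the data and the code that knows how to interpret it, and the receiver only commits to a tiny fixed contract ("the bundle defines render(data)") rather than to the data format itself.

    # Toy "data travels with its interpreter" bundle (illustration only).
    message = {
        "data": "3 1 4 1 5 9",
        # the interpreter: source for a render(data) function
        "interpreter": "def render(data):\n    return [int(x) for x in data.split()]\n",
    }

    def receive(msg):
        # The receiver's only assumption: the bundle defines render().
        # A real system would have to sandbox this -- which is exactly
        # the safety question about transmitted "meanings".
        env = {}
        exec(msg["interpreter"], env)
        return env["render"](msg["data"])

    print(receive(message))  # -> [3, 1, 4, 1, 5, 9]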
TCP/IP is "written in such a way that it works on all target systems". This partially worked because it was early, partly because it is small and simple, partly because it doesn't try to define structures on the actual messages, but only minimal ones on the "envelopes". And partly because of the "/" which does not force a single theory.
This -- and the Parc PUP "internet" which preceded it and influenced it -- are examples of trying to organize things so that modules can interact universally with minimal assumptions on both sides.
The next step -- of organizing a minimal basis for inter-meanings -- not just internetworking -- was being thought about heavily in the 70s while the communications systems ideas were being worked on, but was quite to the side, and not mature enough to be made part of the apparatus when "Flag Day" happened in 1983.
What is the minimal "stuff" that could be part of the "TCP/IP" apparatus that could allow "meanings" to be sent, not just bits -- and what assumptions need to be made on the receiving end to guarantee the safety of a transmitted meaning?
I don't think it's too late, but it would require fairly large changes in perspective in the general computing community about computing, about scaling, about visions and goals.
Do you have an opinion on text-based vs visual programming languages? I think the latter is good for learning, but it feels impractical in my day-to-day job. Is there a sweet spot?
Let's give Dan Ingalls the majority of the credit here. I will admit to "seeing" what was possible, but Dan was able to make really great compromises between what Should Be vs what would allow us to make great progress in the early 70s on relatively small machines (give Chuck Thacker the majority of the credit here for similar "art of the possible" with the Parc HW).
I liked the MOP book because they carried the model more deeply into the actual definitions -- I very much liked their metaphor that you want to supply a "region of design and implementation space" rather than a single point.
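For anyone who hasn't read the MOP book, a rough sketch of the idea in Python terms (my own toy example, not the CLOS machinery the book actually describes): a metaclass is a small metaobject-protocol hook -- the language exposes how instance creation works, so a program can move around within that region of design space instead of being stuck at the single point the language designers picked.

    # Toy metaobject-protocol hook: the metaclass intercepts instance
    # creation, so client code can reshape that bit of the language's
    # behavior for its own classes.
    class Traced(type):
        def __call__(cls, *args, **kwargs):
            instance = super().__call__(*args, **kwargs)
            print(f"created {cls.__name__}{args}")
            return instance

    class Account(metaclass=Traced):
        def __init__(self, owner):
            self.owner = owner

    Account("ada")  # prints: created Account('ada',)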
It doesn't suck "getting old" -- and you only find out about stamina by trying to do things ...
(We are fortunate that most of what is "new" is more like "particular 'news'" rather than actually "new". From the standpoint of actual categorical change, things have been very slow the last 30 years or so.)
But, it's also worth looking at places where there's been enough change of one kind or another to constitute "qualitative". This has certainly happened in many areas of engineering and in science. How about in computering?
Can you explain more about what in UIs you think has gone downhill? I've seen you refer to this idea in quite a few of your comments and it would be great to get insight on what aspects you think need improving/exterminating/rethinking.
But let's see -- how about UIs giving up on UNDO, not allowing or not showing multiple tasks, not having good ways to teach how to use the UI or the apps, ...
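As a concrete illustration of what "not giving up on UNDO" asks of an app, a minimal sketch (names like TextDoc are made up): every edit is recorded as a reversible operation on a stack, so the user can always walk back.

    # Minimal undo stack: each edit records how to reverse itself.
    class TextDoc:
        def __init__(self):
            self.text = ""
            self.undo_stack = []

        def insert(self, pos, s):
            self.text = self.text[:pos] + s + self.text[pos:]
            self.undo_stack.append(("delete", pos, len(s)))

        def undo(self):
            if self.undo_stack:
                op, pos, n = self.undo_stack.pop()
                if op == "delete":
                    self.text = self.text[:pos] + self.text[pos + n:]

    doc = TextDoc()
    doc.insert(0, "hello")
    doc.insert(5, " world")
    doc.undo()
    print(doc.text)  # -> hello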
I personally vouch that this isn't fake. (But anyone familiar with Alan's work could tell that just from reading his comments here—who could possibly fake these?)
HN has always worked informally. We don't have 'processes'; it isn't clear that we need them, and the thought of having to set them up makes my soul cry. But anyone who has concerns about fakes, abuses, and the like can always get an answer by emailing hn@ycombinator.com. We appreciate such emails because occasionally they point out actual abuses that we need to take care of!
This is assuming the real one wouldn't have articulated anything unique and contextually valuable. (Unless you've got some sure-fire method to determine which statements are objectively so and that's what's getting confused.)
In cases like this you can send an email to the mods at hn@ycombinator.com. Most of the time dang replies very quickly. (And if the story is a fake, they'd usually remove it from the front page.)