Hacker News

Sorry. I consider CGI a sort of “failed experiment of the past,” much like Server-Side Includes (shtml). What CGI does is let anything that can write to a pipe publish on the Web (you could write a web app in Bash).
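For example, a complete CGI “app” really can be a plain shell script: the web server execs the script with the request described in environment variables (`QUERY_STRING`, `REQUEST_METHOD`, …) and streams its stdout back to the browser. A minimal sketch (the script and handler name are illustrative, not from any particular setup):

```shell
#!/bin/sh
# Minimal CGI responder: HTTP headers first, then a blank line, then the body.
# The web server supplies the request via environment variables and relays
# whatever this writes to stdout back to the client.
cgi_handler() {
    printf 'Content-Type: text/plain\r\n\r\n'
    printf 'Hello from a shell CGI script.\n'
    printf 'Query string: %s\n' "${QUERY_STRING:-<none>}"
}

# A CGI-capable server (e.g. Apache with mod_cgi) just runs the script;
# calling the handler directly also lets it run standalone for testing.
cgi_handler
```

Dropped into a server's `cgi-bin` directory and marked executable, that is the whole application.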

I think that anything that will interact with remote browsers should go through a web server. Many utilities actually have their own built-in web servers (not sure that’s always a good thing).



But why is it a failed experiment?

Is it because the simplicity results in various footguns, much as most programmers probably shouldn't use C these days?

(Does "web server" here mean an extra layer on top, the way Python lets you (mostly) forget about memory handling issues?)

Or is it that the simplicity doesn't add performance here, or simply that the performance is rarely needed?


Let me rephrase that: It’s not “failed,” per se, but it is inaccessible.

It’s what systems programmers would like, because we don’t like things that get in the way. However, most folks aren’t systems programmers. They want padding and safety nets.



