One thing to watch out for here is patterns that encourage developers to do the wrong thing.
With golang, for example, it's common to see applications that don't set reasonable cache-control headers, don't set content-type and content-length headers, don't handle HEAD requests properly, skip compression, and so on.
It's not that golang can't do those things, but the libraries, documentation, and examples don't encourage it. So you see a lot of minimal HTTP implementations that "work", but not well. It's like every (less experienced) end user of the library has to learn all the basics by trial and error, bug reports, etc.
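None of this is hard to do in Go, to be clear; it's just not what the minimal examples show. Here's a rough sketch of the kind of handler those examples rarely demonstrate -- the handler name, payload, and cache policy are made up, and compression would be a separate wrapping middleware that I've left out:

```go
package main

import (
	"net/http"
	"strconv"
)

// greetingHandler is a made-up handler showing the headers that
// minimal net/http examples usually leave out.
func greetingHandler(w http.ResponseWriter, r *http.Request) {
	body := []byte(`{"greeting":"hello"}`)

	h := w.Header()
	// Content type and length: net/http only does best-effort sniffing,
	// so say what you mean explicitly.
	h.Set("Content-Type", "application/json")
	h.Set("Content-Length", strconv.Itoa(len(body)))
	// Cache policy is entirely the application's call.
	h.Set("Cache-Control", "public, max-age=60")

	// The default mux sends HEAD to the same handler as GET, so the
	// headers above are shared; just skip the body for HEAD.
	if r.Method == http.MethodHead {
		return
	}
	w.Write(body)
}

func main() {
	http.HandleFunc("/greeting", greetingHandler)
	http.ListenAndServe(":8080", nil)
}
```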
I think this is mostly a layering issue. If everyone who just wants to scrape HTML or talk to an API is using the basic HTTP API, then you've got this problem. That's where Python was about ten years ago - using the stdlib's urllib, it was very easy to talk HTTP, but you'd probably do it badly.
Then requests came along, aiming to make it easy for anyone to consume HTTP, with sensible defaults -- and it rapidly became the de facto standard way to get things done. Hopefully the rust ecosystem will grow similar libraries, built on top of this new 'http' base.
This particular announcement is about defining shared API types between libraries for which (I think) the things you've listed would be a concern. It's not a full HTTP implementation, AFAICT.
The point is that pushing this as a concern onto every implementing library is dangerous and likely not to end well. (At least, that's how I took it.)
I'd argue that all but the most basic HTTP options are far from universal, and defining sane defaults is the responsibility of frameworks and not a core library.
As I understand it, this http library is meant to be a 'low-level' representation of an HTTP request, so I think it's right that it doesn't force lots of default HTTP headers or settings.
HTTP requests rarely live in isolation; you're probably going to be making several requests to one or more servers, so most language libraries have another layer on top of the raw HTTP objects (e.g. in Perl there's LWP::UserAgent, among others). That layer is where the network code tends to live, and it's the more appropriate place for some defaults: keep-alives and connection reuse, for example, are vital for efficient HTTP communication, but they apply across multiple requests, so you can't sensibly control them at the base 'this is a single HTTP request' object layer.
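The same layering is visible in Go, for instance: keep-alive and connection reuse get configured on a shared client/transport that outlives any single request. A loose sketch (the tuning numbers and URL are placeholders, not recommendations):

```go
package main

import (
	"io"
	"net/http"
	"time"
)

// Connection reuse lives at the client/transport layer, shared across
// many requests -- not on any single request object.
var client = &http.Client{
	Timeout: 10 * time.Second,
	Transport: &http.Transport{
		MaxIdleConns:        100,
		MaxIdleConnsPerHost: 10,
		IdleConnTimeout:     90 * time.Second,
	},
}

func fetch(url string) error {
	resp, err := client.Get(url) // each request rides the shared connection pool
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	// Draining the body lets the connection return to the idle pool.
	_, err = io.Copy(io.Discard, resp.Body)
	return err
}

func main() {
	// Repeated requests to the same host reuse one TCP connection.
	for i := 0; i < 3; i++ {
		_ = fetch("https://example.com/") // placeholder URL
	}
}
```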
That doesn't rule out good examples, though. And some of these are close to universal: maybe not in the specific value, but in the sense that they should be set to some value rather than omitted.
Agreed. And in particular, shared types like Request and Response would make it much easier to write a framework-agnostic library on top of them that helps you handle caching correctly.
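The announced types are Rust's, but the shape of the idea is familiar from Go, where anything that speaks the shared handler/request/response types can be wrapped by a reusable helper without caring which framework produced it. A sketch with invented names and an invented cache policy:

```go
package main

import "net/http"

// withCacheControl is a hypothetical framework-agnostic helper: because
// everyone agrees on the same request/response types, it can wrap any
// handler and apply a caching policy without knowing which framework
// produced that handler.
func withCacheControl(policy string, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Method == http.MethodGet || r.Method == http.MethodHead {
			w.Header().Set("Cache-Control", policy)
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	// Placeholder content: serve files from a local "static" directory.
	mux.Handle("/static/", http.StripPrefix("/static/", http.FileServer(http.Dir("static"))))
	http.ListenAndServe(":8080", withCacheControl("public, max-age=3600", mux))
}
```

That kind of drop-in reuse is exactly what agreeing on base types buys you.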
I'd argue that at the API level, you should be required to think about these things. Any "help" that means I don't have to worry about cache headers should come from a framework that tells me what it thinks about them.
> I'd argue that at the API level, you should be required to think about these things. Any "help" that means I don't have to worry about cache headers should come from a framework that tells me what it thinks about them.
Agreed completely. The lowest-level API should expose them and every other bit of the standard, and higher-level APIs should handle them automatically in sensible ways.
>> defining sane defaults is the responsibility of frameworks and not a core library
But... but... the Golang core team teaches us that "framework" is a four-letter word and the core library is enough for everybody. No need to overcomplicate with extra abstractions; just use the standard library, they say.
Not sure if that's a jab at my post; I'm not trying to be controversial. But it doesn't appear that the example code shown for returning a JSON response even sets the Content-Type header to application/json.
Since there's obviously some time lag before frameworks that use this core library will appear, good documentation on this sort of thing seems advisable.
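For what it's worth, in Go's net/http at least, the missing piece is a one-liner (the handler name and payload here are invented purely for illustration):

```go
package main

import (
	"encoding/json"
	"net/http"
)

func statusHandler(w http.ResponseWriter, r *http.Request) {
	// Without this header, net/http sniffs the body and will typically
	// label a JSON payload as text/plain; charset=utf-8.
	w.Header().Set("Content-Type", "application/json")

	json.NewEncoder(w).Encode(map[string]string{"status": "ok"})
}

func main() {
	http.HandleFunc("/status", statusHandler)
	http.ListenAndServe(":8080", nil)
}
```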
That's not a jab at your post. Look at Swift and the way they encourage creation of new frameworks and server applications [1].
Then look at the hostile atmosphere in the Go community and the tweets of its core team members stating that the "http" package is great for everything and there's no need to use anything else. The framework (or, as they prefer to call it, toolkit) authors throw shit at each other, accusing their opponents of creating something unconventional or not "in the spirit of go", while usefulness for end users isn't taken into account. Others write posts about how everybody should stop creating frameworks immediately because it makes them uncomfortable. People hesitate to open-source their code because they're afraid of becoming the victim of a crusade. That's not exactly a healthy atmosphere, and it's nourished by core team members.