Hacker News | frigo_1337's comments

One thing from Clean Code that fundamentally changed the way I write code was the bit about how “Comments Lie”.

I hadn’t really thought about it much before, but since then I see lying comments absolutely everywhere, many of which I have written myself. The author is right: comments are really hard to maintain.

Nowadays I try my absolute hardest to avoid writing comments. 9 out of 10 times, it can be solved with some refactoring.
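For what it's worth, the refactoring I mean is usually just extracting a well-named constant or function so the comment becomes redundant. A minimal sketch (the eligibility rule and all the names are made up for illustration):

```javascript
// Before: a comment describing a rule the code no longer implements.
//   // user must be over 18 and verified
//   return user.age >= 21 && user.verified;   // the comment already lies

// After: the rule lives in named code, so there is nothing to drift out of sync.
const MINIMUM_AGE = 21;

function isOfMinimumAge(user) {
  return user.age >= MINIMUM_AGE;
}

function isEligible(user) {
  return isOfMinimumAge(user) && user.verified;
}

console.log(isEligible({ age: 25, verified: true })); // true
console.log(isEligible({ age: 19, verified: true })); // false
```

If the age rule changes, you change `MINIMUM_AGE` and the code keeps "documenting" itself; a comment would have needed a separate, easily forgotten edit.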


Comments are really useful for explaining the reasoning behind complex sections, or the edge cases that caused the code to be written. Other than that I hate boilerplate comments, and try to make the code self-documenting.


Thinking code can be "self-documenting" is one of the biggest delusions of this field.


You’re being downvoted but it’s extremely true.

Code should strive to be as self-documenting as possible, but there will always need to be some level of documentation, and good comments are part of that.

One other part of that effort should be good unit, integration, and acceptance tests.


I'm not a huge fan of the UI (images are nice, but only for about half of the usual HN content), but I am very impressed by the performance of the app.

The speed at which images and content are loaded (esp. during paging) makes it feel very snappy to use. I don't know why, but it "feels" even faster than the standard client.


I'm getting tired of these pointless non-specific rants. You could write an article about "How it feels to learn C++ in 2016" and make it all about the complexities of operating systems, linking, compilation, text editors and the QWERTY keyboard layout, and you'd still be as accurate.

All you need to "JavaScript in 2016" (as a beginner) is a config file and one or two commands. That's it. If that's too much information or too hipstery for your taste, then follow in the footsteps of other programming languages and use an IDE with a button that hides that complexity for you.
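To make that concrete, here's a sketch of what such a config could look like, circa 2016. This is a hypothetical npm/webpack setup; the package name, scripts and version ranges are assumptions, not a recommendation:

```json
{
  "name": "my-site",
  "private": true,
  "scripts": {
    "start": "webpack-dev-server",
    "build": "webpack -p"
  },
  "devDependencies": {
    "webpack": "^1.13.0",
    "webpack-dev-server": "^1.14.0"
  }
}
```

The "one or two commands" are then `npm install` and `npm run build`; everything else hides behind the config.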


> All you need to "JavaScript in 2016" (as a beginner) is a config file and one or two commands

And 3 months from now in 2017 those 2 commands will be deprecated because no one uses those programs anymore.


As far as beginners are concerned, that shouldn't really matter.

Hide all that complexity in a `./build.sh` or `make` and hand them a config and a README. The author of the article is a web designer with a slightly technical problem to solve. He didn't even need to know about gulp, or grunt, or webpack or babel. Those tools are (should be) as relevant to his domain as the tools used to manufacture the circuits that run them.


Not everyone is a beginner web designer. Some people are trying to build things from scratch using JS.

"Just use this magic build.sh!" sounds a lot like "Just use this magic starter kit!" or "Just use yeoman!"

Who is supposed to write this 'build.sh' in this scenario of yours?

Without fail, every single magic build system or magic starter project I have ever used is now deprecated and abandoned.

  Makefiles? No one uses those anymore, use Grunt!
  Grunt?  No one uses that anymore, use Gulp!
  Gulp?  No one uses that anymore, use webpack!
I have a react app that I built on top of a starter kit that I now need to rebuild using create-react-app because the build process broke when I tried updating something.

Meanwhile, I can pick up a python project I worked on 10 years ago, or a go project I started 4 years ago and everything works exactly the same.


The number of libraries is too high, but the bigger issue is the speed at which each one is replaced. I thought the ruby ecosystem was bad 10 years ago, and the python ecosystem bad 5 years ago, but these are like sloths on Vicodin compared to the absurd rate at which the JS community seems to adopt and deprecate libraries and frameworks and whatever else they're calling them today.

The worst thing is, every single problem these frameworks are trying to address is a solved problem, and has been for decades. But people love reinventing wheels.


I don't think there are any stable ones; even JetBrains' IDEs bang their heads against all the JS tools' complexity nowadays.


When "sprinkling Javascript" turns into "making sauce for the spaghetti you've been cooking", it's probably time to consider React or something similar.


What do you mean?

Somebody wanted to write code in a particular way and extended the language with a couple of utilities for doing so, then decided to publish it for others to use.

I don't see the problem. Could you elaborate?


Honest question: Why?

I don't understand why people inherently dislike Javascript (aside from, y'know, creepy ad networks).


Because we want the information to be free. If the web server is serving a document, you can do all sorts of stuff with it - you can index it, you can transform it, you can save it for later.

If the web server is serving a DRM-ed program, that loads the human-viewable data over non-standard interfaces, all that breaks. Only humans in front of the web browser will be able to see the data.

Or by people sufficiently dedicated to run a headless browser, execute the JavaScript, and work on the rebuilt DOM. But then we also need NoScript, JS blockers and third-party blockers, and the publishers invest in anti-adblock. It's a never-ending arms race between those who want to publish information restrictively and those who want the information without restrictions. All these JS-based workarounds to DRM things only bring more work for everybody involved, with minimal results.

NOTE: when I say DRM, I actually mean Digital Policy Enforcement - the publishers want to maintain their policies around access to their information (e.g. you cannot see this article without seeing this ad) using digital means. But DRM has a nicer twist to it - the uninformed may mistake the R for My Rights.


I like the term DRP (Digital Revenue Protection)

Seems to cover the intent of things quite nicely :)


RMS always refers to DRM as Digital Restrictions Management.

Also conveys the intent quite nicely - plus you can keep the acronym ;-)


I thought he used it to refer to delicious ripped foot manifolds?

http://youtube.com/watch?v=I25UeVXrEHQ


Because, besides the creepy ad networks and stuff, it's most often used in a mix of shitty engineering and user-hostile practices.

Let's consider a web document like this article here. Its stated goal is to be read by the visitor and thus deliver value. So presumably, an article that's easier to read is better than one that's harder to read. An article that, ceteris paribus, consumes fewer resources on the user's end is better than one that consumes more.

Now we have a perfect technology to deliver that article. Plain old HTML. With a little bit of CSS on top. When what you want to send is text communicating a message, you need exactly zero JavaScript to do that successfully[0]. You barely even need much CSS - the default browser styles, raw as they are, are better than most web designers produce, if you care about providing value to the user.

Now if you don't, this is where JavaScript comes in. Look at what the JS on Wired does and find me one line of code that actually serves the user. The JS there tracks you, shows you ads, shows you nagging popups[1], and adds social media buttons that are somewhat useful if you want to trade being tracked everywhere for the convenience of not having to CTRL+TAB to that Facebook tab. In general, the JS here is a waste of electricity (often from users' phone batteries).

You can run a similar analysis on other websites[2], and rarely if ever will you find one where JavaScript does anything other than fuck users over more or less subtly. The technology is fine, but everyone[3] is using it for user-hostile purposes, and/or because of bad engineering. Think of all the scroll hijacking, the JS rendering article text dynamically on a blog page, etc. Personally I dislike it for the very same reason I dislike crappy code.

--

[0] - Sure, JS can be used to qualitatively enhance the reading experience, to make it more pleasant and efficient. I accept that in principle, but I'll cede the point only when I see anyone other than Bret Victor actually doing it.

[1] - Wired, I appreciate that you wanted to say "thank you" to me for turning off uMatrix for a second, but could you please not do that with a popup?

[2] - Web apps are a different topic; I don't think anybody is saying you should turn off JavaScript for GMail or Google Docs. But most of the sites on the web are not, and should not behave like web apps.

[3] - Except Bret Victor.


I once was smart enough to burst into a rant about exactly this in an interview question.

I didn't get the job.

Next I was joking with a friend that all the layers of abstraction added to the web are probably part of some big conspiracy by web developers to create artificial demand and job security. You run a heavy CMS, but the bells and whistles confuse rather than help the user, and they'd rather pay for an hour of our time than figure things out themselves. Your SPA makes a simple series of documents feel like an app, with all the added complexity, but in the end the user couldn't care less about the full-page transitions or parallax scrolling.

I have one simple rule: if it's about information retrieval, it's supposed to be a simple bloody document. If it's supposed to act like an app, it should look like an app. The latter is the proper use case for JS, but people are applying the latter to the former. This is not user-centric design.


Agreed.

JS on a website that is otherwise not an app has its use cases, e.g. tabulated data, search, filtering, realtime data, etc.

But a static website displaying a simple article has no fucking business running any code other than HTML/CSS on my computer.


Reddit's mobile site takes 1-2 seconds to load comments through JavaScript, and heaven forbid you click a link; that destroys your scroll position. Disable JavaScript and comments load instantly and the back button actually works.

JavaScript seemingly is more often used to degrade basic site functionality, rather than enhance it.


Because JS-heavy sites are slow on slightly older PCs that are otherwise perfectly capable of most other tasks.

See idlewords.com/talks/website_obesity.htm


Speed and bandwidth usage are part of it. I visit a site wanting to see specific content. The vast majority of the 44 domains (according to uBlock Origin) that Wired connects to aren't actually needed to render what I'm trying to look at. They're mostly ads, tracking scripts and analytics, none of which help me, and most of which are never used to shape the content, only to monetise user data and serve adverts.


I recently found a stash of web pages I had saved locally over a decade ago. They still load... a modern one wouldn't.


You should create screenshots instead.


Those don't scale with resolution changes.

A saved webpage can stay as responsive as an ebook.


Screenshots are also bigger than necessary (image instead of text + metadata), and are not greppable.


Why? So I can fuck around with OCR later?

The data is already structured.


...but why do you want to access it by index?

Since you refuse to be more specific, it sounds like you're just not willing to learn the language and the way problems are solved with it.

Expecting mutable C-style arrays in Elixir/Erlang is like expecting classes and lambdas in an assembly language. It's just not a good fit for what the language was designed to do.


Not the OP, but I've had a side project in Elixir, and I often wanted to browse/play with some of the last saved models in the REPL. When I get the collection using `ecto`'s `Repo.all..` I can't just `bets[4]` or `bets[-2]`; I have to do weird gymnastics like `hd(tl(bets |> Enum.reverse))`



thank you!


I really like the ideas behind Matrix. Sure, the existing (open) clients aren't all that sexy, and synapse can be a bit slow sometimes. But implementing your own client is a breeze and bridging other networks through application services makes it incredibly powerful.

My experience with the matrix.org community has been very pleasant so far. Keep up the good work, folks!


The "positive" things about the issue/pull request etc. is presumably already explained by the author.

"Issue 1234: We need this feature because reasons A, B and C."

If you don't think that A, B and C justify the change, send a downvote and/or explain why A, B and C aren't good enough reasons.

An upvote is sufficient to say that "Yes, I agree with A, B and C. Do it."

Of course, if you agree with the change but for reasons other than A, B or C, you can leave a comment to start a discussion. It has nothing to do with culture. It's just a way to minimize the redundant "Yes, I agree with what you just said".


I don't see anything about Ethernet/Wi-Fi. Am I missing something?

I understand they had to cut some features to reduce the price. But networking is such a fundamental requirement for these types of systems, you'd think it would be the last feature to be excluded.


Can the HDMI port be abused for some kind of networking?

There are a whole bunch of extra features tucked away inside HDMI. I know the Pi doesn't support Ethernet-over-HDMI, because it's missing the necessary PHY, but apparently it does support I2C-over-HDMI. I2C isn't very fast (it tops out at 3.4 Mbit/s, if you're lucky); is there anything else usable hidden away in there?
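To put that figure in perspective, a back-of-the-envelope calculation, assuming the quoted rate is I2C high-speed mode at 3.4 Mbit/s and ignoring protocol overhead entirely:

```javascript
// Rough best-case transfer-time estimate over high-speed I2C.
const bitsPerSecond = 3.4e6;       // I2C high-speed mode, best case
const payloadBytes = 1024 * 1024;  // 1 MiB of data
const seconds = (payloadBytes * 8) / bitsPerSecond;
console.log(seconds.toFixed(2));   // ≈ 2.47 s per MiB, before any overhead
```

So even in the best case it's a toy link next to USB networking, but plenty for sensor-style traffic between boards.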

One interesting thing about I2C, though, is that it would be cheap and easy to build a chassis with a whole bunch of these $5 modules wired together via I2C. It's almost certainly not cost effective in any way, but would be interesting, particularly if you could get useful work out of it...


The networking on the other Pis is attached via USB. It's a very easy and obvious thing to drop.


Yeah I guess.

Still, I for one would be more than willing to pay a few extra bucks for a stripped-down version of the Pi with wireless included.

Actually, all I'm looking for in this type of machine is Linux, Wi-Fi and GPIO, in a reasonably compact format. Maybe the Pi isn't a good fit, since it is more about "teaching computing" than "building your IoT stuff".


Exactly. If you need Ethernet, a USB-to-Ethernet adapter is $1.50 on AliExpress including free shipping from China.


And one kind person suggested PPP over serial in the comments on the Make article: http://elinux.org/RPi_Serial_Connection#Network_connection_w...


I set up ppp between a pair of RPis a few years ago - you can crank up the speed quite a bit past 115200 baud.


I agree. If this included an Ethernet port it would be the cheapest embedded Linux board with wired networking. I would really like such a device and am kinda sad it doesn't exist. Like an ESP8266, but wired.



There's a USB port.


Plug a wipi into it via the USB OTG cable. Done.


No. A wireless adapter needs a better power source, so it's RPi to powered hub to Wi-Fi adapter, which makes the setup quite a bit bulkier and pricier.

Like the GP, I wish some sort of network connectivity could be included to make it more complete as a standalone computer.


Nope; we have hundreds of Pi 2s with two WiPis in each, and it's fine.


You don't need a hub; you need a smartphone charger (one that supplies more than 500 mA).

