Especially on long-distance trips, I try to avoid layovers because of the uncertainty they introduce. If I could be guaranteed that my first leg would leave and arrive on time, then I'd happily book two legs with a break in between over one long flight.
Listening to the live radio communications on a flight was eye opening. Turbulence aside, I can usually feel when the plane is moving to a new altitude (either up or down), and while I always assumed it was planned, being able to hear the process of the pilot asking for clearance to move, getting clearance, and then feeling the plane rise or fall took all of the fear out of it.
You might be interested in TruffleRuby: https://github.com/oracle/truffleruby "A high performance implementation of the Ruby programming language. Built on the GraalVM by Oracle Labs."
I researched this yesterday against Crystal. It seems people won't touch it because of Oracle, which is a real shame. TruffleRuby is a 9x speed improvement over Ruby, but because Oracle is seen as a patent nightmare and a dying organization, people aren't interested.
I went to go look for evidence of oracle dying and actually came up empty handed. I was also under this impression. I think it has waned significantly in popularity in the developer community over the years, but their stock isn’t bad at all and they seem to have diversified quite a bit in the time they’ve existed.
Oracle doesn't even seem that patent-happy these days. They seem to have gotten burned by their experience with the Java patents (all useless; the case now revolves around copyright).
Well, JRuby runs on the JVM, which is developed by Oracle. Basically, if you're a Ruby user who wants good performance, there aren't many other places to turn.
But you know Oracle acquired Java long ago and other than Google, I'm unaware of Oracle causing problems for any other users. And Google is a rather special case - they reimplemented an incompatible version of Java without licensing it. You're not going to be doing that.
Don't be too jealous. My main concern with Node and its ecosystem is maturity and security. It loses hands down on both those points compared to Ruby / Rails, especially in the area of dependencies.
Unfortunately Sails is nowhere near as good as Rails. Which is kind of strange given the size of the JavaScript community, and the fact that PHP has Laravel and Python has Django, both of which are comparable to Rails.
Amusingly given all the PHP hate, if you want a faster rails-like framework, Laravel might be your best bet.
I don't know anything about Laravel, but I've been working with Rails since 2006 and with Django since 2016. Django has little to do with Rails. It's much more similar to Java Struts from 2005 (when I left it for Rails): form objects and template tags, among the other nuisances. No XML, thank god, but a weak deployment story (no Capistrano or Mina; I built my own tool). I'd pick Django over Struts without thinking (I don't do Java anymore, even with more modern frameworks), but I pick Rails over Django any time customers give me the choice. Django (and Struts) are optimized for large projects at the cost of developer time, but very few projects grow even to medium size. Django has a decent admin tool. That's the only advantage I can think it has over Rails for the typical project I do.
I also worked with Phoenix (Elixir) in the last 12 months. It's kind of midway between Rails and Django in terms of framework and language complexity.
Awesome to see the creator of Sails here! And awesome that you have such a positive and constructive response to criticisms! I see that Sails is also a company and not just an open source project, are you a full time company living off of using your open source work? If so can you give any advice to someone who is trying to learn how to make a living with open source? I am full of energy and passion for software and want to put that towards open source, but it would be much nicer to also get paid while doing so!
Does that mean there's still room in the JS ecosystem for a Rails-alike? Maybe one that could become very popular? Because I am looking for a major open source project to create and spearhead, something that could get hundreds of thousands of active users and a thriving subcommunity, but I've been holding off until I find just the right project.
Personally, having developed in both Rails and Node, I now believe that trying to recreate Rails in JS is a foolish effort. Rails is a huge project with a ton of weight behind it and unless you can generate a massive amount of corporate investment, you won't ever be able to really catch up.
I'm not saying that there's no room in the JS ecosystem for another web framework, what I'm saying is that if your design goal is re-implementing Rails, you're already setting yourself up for failure. Sequelize has tried to be ActiveRecord for how long? The only thing it makes me do is want to go back to Ruby and Rails.
No, what you have to figure out, if you want to do a JavaScript web framework, is how to get me to actually want to use JavaScript to program a website. How can prototypical inheritance actually contribute to a workflow, as opposed to a more conventional inheritance scheme?
But honestly, quite frankly, I can't tell why anyone would want to use JavaScript on the server at all.
For comparison, (according to `find . -name "*.rb" | xargs wc -l`) Rails back then was about 66K LOC (plus 38K LOC of tests) and is now 137K LOC (plus 216K LOC of tests).
However, Steve says he only ported "essentially all of ActionView, ActionController and Railties, plus a teeny bit of ActiveRecord", so maybe half of the total.
As for the second half of your comment, it sounds like you haven't used JavaScript in a while. ES6 introduced syntax for classical inheritance, and a bunch of other nice features.
I use ES6 at my job, daily. There are some things about it that make me really hate it. First, no, it doesn't introduce classical inheritance. It introduced syntax that looks like classical inheritance. It's still prototypal inheritance under the hood. I'm not entirely sure what this means yet from the standpoint of building things with it, but I'm not enthused at the prospect.
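To make the "syntax that looks like classical inheritance" point concrete, here is a minimal sketch (names are invented for illustration) showing that an ES6 `class` still builds the same prototype chain as a pre-ES6 constructor function:

```typescript
// Sketch: ES6 `class` is syntax over the existing prototype machinery.
// The class below and the constructor-function version behave the same at runtime.

class Animal {
  name: string;
  constructor(name: string) { this.name = name; }
  speak(): string { return `${this.name} makes a sound`; }
}

// The equivalent pre-ES6 construction, written out explicitly:
function AnimalFn(this: { name: string }, name: string) { this.name = name; }
AnimalFn.prototype.speak = function (this: { name: string }): string {
  return `${this.name} makes a sound`;
};

// Methods declared with `class` still live on the prototype, not the instance:
const a = new Animal("Rex");
console.log(Object.getPrototypeOf(a) === Animal.prototype); // true
console.log(a.hasOwnProperty("speak"));                     // false: found via prototype lookup
```

So the `class` keyword changes how the code reads, not the underlying object model, which is what the comment above is getting at.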
What I find happens fairly frequently with ES6 that I never found with Vanilla JS is that the extra features built on top of JS 'break'. If you add syntax to a language, that syntax needs to work. I need to be able to rely on it working. I run into constant little issues that make me think I'm missing something about scoping, when really it's some under-the-hood issue with a library or something I just don't have the time to pin down.
One example: I tried using the spread operator to add keys to an object, but the spread simply failed. Passing in the object worked fine; passing in a spread version of the same object failed. I haven't resolved it yet, because I got pulled onto another feature. This sort of breakage is hard to Google, and when I run into it again (and I'm sure I will), I may have to troubleshoot it all the way down to Babel.
Error reporting in Node with ES6 is garbage. Worse than garbage, it's a veritable dumpster fire. Ideally the error points to the problem; when it doesn't, you have to rely on experience and intuition to lead you to the issue. Many, many errors I come across in JS are of this sort.
I think most of these issues come down to the fact that ES6 is a transpiled language. This makes me long for the days of good old CoffeeScript. At least CS was close enough to Vanilla that it was easy to determine when you had an issue with the transpiler, simply grab the snippet of code and paste it into the online transpiler, look at the generated JS and work out your issue from there. It wasn't the smoothest workflow but it was effective.
ES6 is stupid in ways that make building an effective workflow unreasonably difficult. I can't wait to get back to Rails. Vanilla JS wasn't that bad. It was well-understood and you could work with it effectively on the front-end. It certainly wasn't as pleasant as Ruby, but it didn't feel like the pile of hacks that ES6 does. Maybe once it stops being transpiled it'll get better.
Indeed, JS does not have classical inheritance. IMO introducing a class syntax that looks so much like the classes from other languages and yet behaves differently was a mistake.
The rest of your comment sounds a bit misinformed to me. You seem to have decided that prototypical inheritance is somehow inherently worse than classical inheritance, but don't mention why that would be. I have found very little practical difference between the classes in JS and other languages in everyday use, and can't quite imagine what the problems with it might be.
Furthermore, ES6 (ES2015) is not a "transpiled language". I think it's obvious that if you take a very new language, say, ES2018 and want to run it on your toaster that doesn't have support for such new languages, you are going to have to do some kind of precompilation step. That is true for ES2018 today and it will be true for ES2019 next year and for ES2020 the year after that.
ES2015, however, has been around for several years. All the modern browsers (that is, all major browsers except IE) support it already. Node 10 even supports the new module loading syntax (behind a flag), or you can use a very lightweight transformer like esm[1] for older Node versions.
And if you do end up using and having problems with, say, Babel, it would be more constructive to give concrete examples of the issues you've had. I personally have never faced a syntax problem where the issue would've been due to a bug in Babel instead of just my incorrect understanding of the language feature.
I didn't intend to argue that prototypal inheritance was bad, just that I didn't relish the prospect of building something in it, in the context of a discussion about reinventing Rails in Javascript. The argument is that with the amount of time and effort that went into Rails, the new framework has to offer something a lot more unique than just "Rails in JS" if it wants to be relevant, because you'll never even get remotely close to the maturity of Rails.
The problem is that I can't give concrete examples because we're under the gun of a deadline and I can't afford to spend the time to troubleshoot down to root causes rather than just work around the problem and move onto another feature.
I'd love to be able to tell you why the spread operator didn't work in that case. But it didn't, and I made sure to get the whole team around me to tell me I wasn't being crazy. The syntax simply didn't create the needed semantics, and that means that something got messed up in the design of the language. I'm pointing to Babel because that's the only thing I can point to as a root cause.
Rails is nowhere even close to this level of broken. You can rely on the syntax and semantics of Ruby. Sometimes gem authors play nasty games with metaprogramming; I saw an example where someone monkey-patched Symbol to get a more declarative method for describing SQL WHERE conditions, but at least that crap wasn't in ActiveRecord.
Rails, as a stack, fits together and experience with the framework will allow you to trust it.
Syntax is the foundation of a programming language. If it doesn't work, if it doesn't produce precisely the behavior that's being described, your language is broken. We're not talking standard library here, we're talking about `{...object}` not behaving the same as `object`. I don't have time right now to dive into why, but that's the kind of shit I run into when I deal with ES6. When syntax breaks, you can't trust the language anymore. It's a pile of hacks and I wish it had never been invented. CoffeeScript was better.
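For context on the `{...object}` complaint: without knowing the specific bug above, there are documented cases where a spread copy is genuinely not equivalent to the original object. This sketch (names invented) shows two of them:

```typescript
// Object spread copies only OWN ENUMERABLE properties, so anything on the
// prototype, or marked non-enumerable, silently disappears from the copy.
// This is specified behavior, not a transpiler bug, but it surprises people.

class User {
  constructor(public name: string) {}
  greet(): string { return `hi, ${this.name}`; }  // lives on User.prototype
}

const u = new User("Ada");
const copy = { ...u };  // plain object: own props only, prototype link lost

console.log(u.greet());       // works: method found on the prototype
console.log("greet" in copy); // false: the method was never copied

// Non-enumerable properties are skipped as well:
const obj = {};
Object.defineProperty(obj, "hidden", { value: 42, enumerable: false });
console.log({ ...obj }.hasOwnProperty("hidden")); // false
```

Whether either case explains the failure described above is anyone's guess, but they are the usual suspects when a spread copy stops behaving like the original.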
Definitely. And using these technologies, there are some cool possibilities beyond what Rails is capable of.
For instance if you used typescript (still allows devs to use plain JS), you could use types to check that forms send all the required params. For example, a login page might require:
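The original example isn't reproduced here, but a hypothetical sketch of such a typed params shape might look like this (`LoginParams` and `handleLogin` are invented names, not from any framework):

```typescript
// Hypothetical: a static type describing the params a login form must send.
interface LoginParams {
  username: string;
  password: string;
  rememberMe?: boolean; // optional field
}

// The compiler rejects calls with missing or misspelled params:
function handleLogin(params: LoginParams): string {
  return `logging in ${params.username}`;
}

handleLogin({ username: "ada", password: "hunter2" }); // ok
// handleLogin({ username: "ada" });                   // compile error: password missing
```

The catch, as the replies below note, is that this is a compile-time check only; it says nothing about what a browser actually sends at runtime.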
Unfortunately TypeScript doesn't do runtime checks like this. It assumes that incoming data conforms to the type spec! However, you might be interested in the Rocket framework (written in Rust), which does do exactly that:
TypeScript can do runtime checks like this! Granted it uses a TypeScript framework (similar to AJV) that both does the runtime checks and preserves type information for your IDE and type checker. I wrote about it here:
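A minimal hand-rolled sketch of the idea (io-ts itself is far more general; `Validator`, `str`, and `obj` here are invented names): a validator object that both checks data at runtime and carries the static type for the compiler via a type predicate.

```typescript
// A validator is a runtime check whose type predicate narrows `unknown` to T.
type Validator<T> = { check: (x: unknown) => x is T };

const str: Validator<string> = { check: (x): x is string => typeof x === "string" };

// Build an object validator from a shape of field validators, inferring the
// static output type from the shape itself.
function obj<T extends Record<string, Validator<any>>>(shape: T) {
  type Out = { [K in keyof T]: T[K] extends Validator<infer U> ? U : never };
  return {
    check: (x: unknown): x is Out =>
      typeof x === "object" && x !== null &&
      Object.entries(shape).every(([k, v]) => v.check((x as any)[k])),
  };
}

const Login = obj({ username: str, password: str });

const body: unknown = JSON.parse('{"username":"ada","password":"hunter2"}');
if (Login.check(body)) {
  // Inside this branch the compiler knows body.username is a string.
  console.log(body.username.toUpperCase());
}
```

One definition serves both worlds: the runtime check guards untrusted input, and the inferred type flows into the IDE and type checker, which is the combination the comment above describes.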
Aw, man - I'd hoped you'd found something to make this less of a stone-cold pain to do. Instead, it's io-ts again. Which is...interesting...but it really sucks to actually use when you have to manually define every in/out yourself. It's 2018. There's no good reason TypeScript can't emit the type information to just build these (and I don't really mean decorator metadata).
You can add another build step right after the `tsc -w` compilation phase, which takes your .ts files, parses them to find out your types, and emits runtime code that checks your types. That's what you're describing that TypeScript should be doing on its own. But I don't see why you'd want to do this instead of using io-ts? This is more complicated overall.
Runtime type information is important. TypeScript is in the best place to provide it. So...it should. Boilerplating our way through io-ts is a waste of my time and yours.
Been using io-ts in my latest project, fantastic bit of code, hats off to gcanti... Only thing is IntelliJ can't keep up with the types generated (got an open bug with them)
You’re probably being downvoted because your comment doesn’t contribute anything to the thread. And rightly so, unless you want HN to be as useless as /r/programming.
Your comment is also borderline rude, as it implies that anyone using JavaScript is unfortunate. I mean, what did you really want to achieve by posting that comment?
> Your comment is also borderline rude, as it implies that anyone using JavaScript is unfortunate
No, it does not, that's just your interpretation. I never used the word 'fortunate' or anything similar. I'm happy not to work with it, as in "I'm happy not to use public transport today, because I like walking". Surely this can't be rude to anyone except those people who see everything as a way to offend them somehow. An inferiority complex, maybe? No idea.
> I mean, what did you really want to achieve by posting that comment?
Just wanted to express myself. As it turns out that's a wrong thing to do unless you have some sort of clear goal in mind. Noted.
I would absolutely choose Ruby over JavaScript for the major advantages it has to offer. In most cases raw performance isn't a real need compared to consistency. Also, Ruby 2.6 has a JIT; it would be great to see some benchmarks. :P
2.6’s JIT is entirely at the method level currently, and unfortunately for certain types of workloads (e.g. Rails) this means the performance benefit is largely outweighed by the call counters and deoptimization checks on every method invocation.
There’s definitely room for improvement, and the inclusion of JIT infrastructure is awesome, it’s just not making much of a performance impact. Yet.
The overheads from the call counters and deoptimization checks are tiny. The real problem is that right now MJIT doesn't allow much optimization beyond generating native code equivalent to simply executing the instructions. It's ridiculously simple compared to V8.
I have similar concerns, but I was just on vacation for 2 weeks and would have loved a way to check-in on my cat / cat-sitter to see how things are going.
Are there products that would permit me to do this while still avoiding privacy concerns?
Synology NAS, and some IP cameras you found online somewhere (it is simpler to ask which cameras Synology doesn't support). You'll have to figure out which external access method works for you (I just use Synology's dynamic IP mapper). My DS410 is seven years old, so I don't know if that's the reason, but remote access can be slow. But it gets the job done.
I remote desktop into my home computers and can turn on the webcams and see what is happening when I am away. Obviously a limited solution in multiple respects, but for just checking in occasionally it works well enough. There may even be open source software somewhere that uses your webcam as a security camera and handles archiving, possibly even uploading daily files to a destination of your choosing so you can access them while away.
How would the cloud be safer? If you have a personal setup you can control the level of security yourself, and a hacker would only gain access to your cameras, and not the cameras of everyone connected to the cloud. A lot less motivation to hack one person, than thousands.
I personally think Zoneminder is beyond "pretty good".
Honestly, for a casual home setup, it goes well beyond what anyone needs. You can configure it to the n-th degree. It supports pro-level PTZ security cams (some of those cameras are expensive af). It has options for triggering based on hardware interrupts (monitoring serial ports, ethernet packets, whatever you want). It can trigger external alarms itself (with additional hardware). It uses computer vision techniques.
In short - it is basically a professional video security package. There are companies out there that package it up (video security appliances) and sell it as such. Again - waaay overkill for most people's needs.
But it isn't very difficult to get set up and running, provided you have the right hardware. For instance, at home I run it using an old 900 MHz P3 with 512 MB and an old hard drive; it supports a couple of cameras easily - I could probably add one or two more streams and still be ok. Beyond that, though, you'd want to increase the CPU and RAM (something I plan to do is break out an old Core2Duo and repurpose it for this job).
I use cheap IP cameras for my cams, but ZM supports a wide variety of cameras (everything from video capture boards, to webcams, ethernet/wifi IP cams, etc). It can upload captured images and videos to servers of your choice, email them to you, there's an Android phone app available...
You can setup a beaglebone/rpi OpenVPN solution in 30 minutes. Then whenever you want to check the camera, connect to the VPN from your phone and launch an app like TinyCamMonitor. This is my setup.
Call (or maybe text) the cat-sitter with a phone? Sorry to be facetious but that seems like the best privacy preserving strategy here and also what I do (for our dog).
No need to apologize - I kind of do that already with texts, but my last vacation was in Japan, so the timezone thing gave little overlap and it would have been nice to have something a bit more asynchronous.
I feel like piracy is going to continue to be an issue until we get a spotify/tidal/apple music-type service for video content. Netflix seemed a lot closer to that several years ago, but it feels like we're moving backwards.
Netflix is so cheap compared to cable, though. I wouldn't mind paying for a few streaming bundles as long as the coverage is wide enough.
I don't know if the industry will move towards consolidation like we've seen in music streaming services, but if prices stay similar to cable but I can watch whatever I want whenever I want, I'd consider that a win.
EDIT: my point doesn't refute that piracy will continue, just that more streaming services isn't too large a regression
Amazon Prime Video is even cheaper. If Amazon manages to increase content while leveraging costs because of their scale and vertical integration, it could be huge.
I have Prime Video. It really is such a pain in the butt to watch, though. They lock themselves out of Chromecast, Kodi, basically everything I currently use to stream to my TV.
I'm really frustrated with it and don't feel like I'm getting value.
I've never quite gotten why Chromecast is this Rubicon for them. Kindle content is available on pretty much anything that you'd expect, and even Amazon Video itself has officially been on Android for quite some time.
I can't imagine it's some play to increase the all-important value of the Fire Stick, so what's the deal?
Amazon is also a good example of a content provider offering "streaming bundles" - you can get add-on subscriptions for HBO and other premium television providers.
Personally, I don't feel like paying for multiple subscription services, and I suspect I'm not alone. Cable still enables me with options, and includes ESPN and Disney Channel. Sadly, this means I still cannot "cut the cord".
People over at SRD (SubredditDrama) have shown some fairly convincing evidence that /r/the_donald has some pretty serious vote manipulation going on, which probably has to do with it. And really, honestly it seems fairly obvious once you go on /r/the_donald and realize that everything has around +1800 to +3500 upvotes, but with significantly fewer comments than it should have (usually in the hundreds or less).
All that said, I thought that there was a limit on the number of posts in /r/all that could be from any one subreddit - so I think it's likely that the above, along with another factor, is the cause.
I've been struggling with this a lot. Granted, it's pretty much all speculation, but if it is a state actor that is giving Wikileaks this information, and if said state actor was trying to influence the election, I question the responsibility of Wikileaks. It could also be (and I'm aware of this) simply that they're exposing information on "my" candidate, in which case maybe I'd feel differently if it was the other way around.
I'm glad this was posted on HN, because I'm looking forward to a (hopefully) more informed discourse than I found on Reddit.
Given that most of the leaks are emails from a single party, when said party was known to use a weak email server, which was known to have been hacked at one point, my guess is that it could be anyone, and doesn't need to be a 'state actor'.
PS. Have you seen the Podesta emails? They range from hilarious to pretty disturbing.
I've poked a bit at the Wikileaks emails. My conclusion is that the email release is designed not to understand the emails in any sort of context but rather to be selectively quote-mined. The use of Courier can obscure the fact that the message is a quadruple-forwarded quote, as well as hide what was highlighted in the original HTML versions of emails. The lack of any attempt to reconstruct threading makes this worse than even the new IETF mailing list archive for trying to read email archives--and that is a bar I didn't think could be cleared.
JSON API seems overly verbose to me. It feels like a direct descendant of the SOAP / WSDL era -- or maybe I'm just scarred from having written and consumed SOAP webservices.