Atom now using Io.js (github.com/atom)
223 points by skyllo on Feb 3, 2015 | 130 comments



For those of you who, like me, aren't sure what this is about: Atom is a text editor from GitHub; io.js is a Node.js fork.


Ah, thanks, I forgot that was the name of the fork and was definitely puzzled.


This isn't particularly surprising; NW.js (previously called node-webkit) switched to io.js as well.

Edit: to clarify, this is relevant because both Atom and NW.js use a webkit shell.


Oddly enough, Atom Shell has apparently not made the switch to io.js? No mention of it on the releases page.

https://github.com/atom/atom-shell/releases


atom-shell absolutely has made the switch to io.js; that's where Atom's node comes from.


Huh, I just downloaded the latest atom-shell and ran a hello-world that dumps the "node" version and it says v1.0.0-pre. That sounds like IO.js alright.

Why then would they still refer to "Node" in the release notes, all the way up to the most recent version??

"Fix initializing node integration in the webview when the page in it navigates."[1]

https://github.com/atom/atom-shell/releases


That's just a typo; it does indeed refer to io.js integration in WebView tags.


It might be about a DOM node, not the Node project.


To be pedantic, Atom Shell and NW.js are both based on Chromium (Blink), not WebKit


Until they fix the 2MB limitation on editing files. No way.


Moore's Law should apply, and eventually processing power should catch up. I think we'll see the 2MB limit removed by 1997/1998.


> Moore's Law should apply, and eventually processing power should catch up. I think we'll see the 2MB limit removed by 1997/1998.

That's a nice summary of the quality of web technologies.

2009: "Hey guys, we can now draw on a widget; we called it Canvas!"

2011: "Wow, look at this cool OpenGL wrapper that locks up your system in ways VoodooFX didn't manage to do in 1998! But you can use it on your web site, which is what people want, right, guys?"


Yeah, I find this stuff depressing. I see people jumping up and down over a demo of the Unreal-1 engine running at 0.04 fps in javascript and I think to myself, if my old compaq POS from the late nineties could run that engine without trouble, why can't my brand new Macbook do better, or at least the same?

It feels like we've redefined "progress" to mean "the same thing as 20 years ago, but worse"


> the same thing as 20 years ago, but worse

More like the same thing as 20 years ago, but the dominant language is a LIS.. I mean a Schem... I mean it's a bit like Self.. look, it has garbage collection, a JIT, supports objects and goes somewhat fast. Oh, and it's used to drive a GUI made up of stringly-typed ... things that kind of looks like SGML and has no chance in hell of achieving good draw performance. But you can calculate fibonacci in CSS and get even worse performance than doing it at compile-time with C++ templates!


I think you would enjoy Alan Kay's talks, especially "the PC revolution hasn't happened yet".


That is why I so enjoy that mobile has brought native back and pushed the web into micro-services, which is what it should have been all along.


WebGL puts in heroic amounts of effort (and many performance compromises) into working around 3D driver bugs and trying to bring about a memory-safe 3D programming environment that still leverages 3D acceleration. Blame the system lockups on buggy drivers, and systemic problems in GPU software stacks, not "web technologies".


Or syntax highlighting breaking because some code you've been given has some 8K-long regex on one line in the middle.


Regular expressions are not the way to do syntax highlighting, yet many of the popular editors do it this way. Create a super-long string in Vim, TextMate, or Sublime Text and watch it blow up; then try it in Xcode or IntelliJ and see that there is no problem. Xcode and IntelliJ highlight quickly because they do not use regular expressions: they use a lexer. They examine the text one character at a time, with some amount of forward- and backward-looking state, and build the syntax highlighting and smart completion features that way. As long as your editor is using regular expressions for syntax highlighting, this will be a problem for you.
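The character-at-a-time approach is easy to sketch. Here is a minimal example (my own illustration, not Xcode's or IntelliJ's actual code): a single left-to-right pass that classifies spans without ever backtracking, so a pathological 8K-long token costs no more than 8K ordinary characters:

```javascript
// Minimal single-pass lexer sketch: classify each span by looking at
// one character at a time, never backtracking.
function isSpace(c) { return c === " " || c === "\t" || c === "\n"; }
function isDigit(c) { return c >= "0" && c <= "9"; }
function isWordStart(c) {
  return (c >= "a" && c <= "z") || (c >= "A" && c <= "Z") || c === "_";
}

function lex(src) {
  var tokens = [];
  var i = 0;
  while (i < src.length) {
    var start = i;
    if (isSpace(src[i])) {
      while (i < src.length && isSpace(src[i])) i++;
      tokens.push({ type: "space", text: src.slice(start, i) });
    } else if (isDigit(src[i])) {
      while (i < src.length && isDigit(src[i])) i++;
      tokens.push({ type: "number", text: src.slice(start, i) });
    } else if (isWordStart(src[i])) {
      while (i < src.length && (isWordStart(src[i]) || isDigit(src[i]))) i++;
      tokens.push({ type: "ident", text: src.slice(start, i) });
    } else {
      i++; // any other character becomes a one-character token
      tokens.push({ type: "punct", text: src.slice(start, i) });
    }
  }
  return tokens;
}
```

A real highlighter would also thread language state (inside-string, inside-comment) through the loop, but the cost model stays linear either way.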


I believe cordite is saying that Atom's syntax highlighting fails when your code contains a huge regular expression.

I don't believe cordite is claiming that Atom's syntax highlighting fails because it uses regular expressions internally.

(I don't know if Atom's syntax highlighting even works that way, FYI. Though I would guess that it does.)


This is the correct interpretation of what I said.

I don't know if it is also related to long strings in general on one line (I haven't tested).


Lexing IS regular expression matching (except for some esoteric features like Haskell's nested comments).

The difference is in the handling.

As a matter of fact, Visual Studio uses semantic analysis: classes are shown differently than other identifiers.


It can be, but it can also be a context-free grammar, which is 'higher' than a state machine (or regular expression).


But there are also higher complexity bounds on CFGs (since they include regular expressions, they certainly can't be lower).

In any case, I've never heard of someone using non-regular expression CFGs for lexing. Are there cases?


I mentioned Haskell above. It has nested block comments, which must be balanced. Thus, a CFG.
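A concrete illustration of the Haskell case (a hypothetical sketch, not GHC's actual lexer): nested `{- -}` comments need a depth counter, which is exactly the bit a single regular expression cannot express:

```javascript
// Skip a Haskell-style nested block comment starting at index i,
// which is assumed to point at the opening "{-". The depth counter is
// the context-free part: a regular expression has no way to count.
function skipNestedComment(src, i) {
  var depth = 0;
  while (i < src.length) {
    if (src[i] === "{" && src[i + 1] === "-") {
      depth++;
      i += 2;
    } else if (src[i] === "-" && src[i + 1] === "}") {
      depth--;
      i += 2;
      if (depth === 0) return i; // matched the outermost "-}"
    } else {
      i++;
    }
  }
  return i; // unterminated comment: consume to end of input
}
```

Everything after the returned index is ordinary code again; an unbalanced comment simply swallows the rest of the file, matching Haskell's behavior.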


I'm so happy I don't work on the files you work on.


I like that my editor can edit hex and it doesn't barf if I give it a giant file. But yea, it's only maybe once a year I open a file that big.


Ever need to peek inside a browserify'd bundle with inline source maps to check transformations (e.g. 6to5)? Those can easily be over 2MB.


Seriously. 2MB files have to be doing it wrong. Right?


In addition to log files (covered in another reply)...

- Database dumps (if you're dumping a database to a text file, how often will it be under 2MB?)

- Data, in general. Tab/CSV text files are still the lingua franca of non-hierarchical data and that will probably never change.

- Code you don't control or haven't refactored yet. OK, we can all agree that a 2MB source code file is probably "doing it wrong." But maybe you don't control it. And even if you do control it... well, you need to edit it to refactor it, right?

That said, I don't think the 2MB limit is a total death knell for an editor. I keep TextWrangler (OSX) and UltraEdit (Win) around precisely because they're good with larger files. If my primary code editor is good at large text files, that's a bonus, but it's not a total necessity.


Crazy, but not out of the question. I use Sublime to edit conflicts in .xcodeproj files. These brutally detailed project files, necessary for iOS/OS X development, can be huge. At my previous job one was 7 MB.


Definitely situational, but not out of the question. Maybe I want to use my favorite text editor to examine a logfile. 2 MB is nothing when it comes to production logs.


So the thing about Atom (and any "disruptive" software) is that it doesn't have to be amazing at everything. It only has to be 10x better for some niche of people who will be this product's early adopters and evangelists.

That will gain it enough traction to prove itself in the market, and over the next few years it will fix all the problems such as the 2MB limit, and whatever else is there. And then, laggards (well, you're probably more of an "early majority" type of person in Crossing the Chasm lingo) will see enough value to join.


It needs to be amazing at something though.

And well... it isn't. The biggest thing I can tell is that it's "hip". Meanwhile Sublime Text spins around it at 40MPH doing donuts.

I tried. I really tried to use Atom. But the text editor is just so slow and unresponsive, and that is a flat out dealbreaker for a text editor. Its extensibility isn't any better than Sublime's, and I feel like Python's a way better language than Javascript, so I'm happier tooling against Sublime.

I too wish Sublime Text was open source, coming from an open source development background, but I understand exactly how hard it is to develop open source software and feed and clothe yourself - people only buy "support contracts" for the first year and rarely renew, and only for software they feel is technically complex enough that they might actually need help or customizations.

For the record, I also backed GNOME Builder (which I also already find superior to Atom) in the hopes of some day it becoming the text editor and development environment I lust after. But my greatest fear is that more time will be spent on the shiny features and not enough time on the things I really care about like a decent GUI debugger so I don't have to live with poking at GDB, ahead-of-build compiler warnings/errors, and code refactoring tools.


I was really interested in Atom but I found it to be a little sluggish (particularly when I switched it to the background) and sometimes unstable on my Macbook.


Perhaps your computer is older? It's blazingly fast on my machine, and as people move to newer machines, speed is not going to be an issue.

Compared to Sublime, Atom developers can build much richer interfaces.

My own SuperCollider IDE in Atom:

https://atom.io/packages/supercollider

This is a full REPL with a debugging call stack.

A Haskell IDE:

https://atom.io/packages/ide-haskell

Preview your development work in a web pane:

https://atom.io/packages/mobile-preview

Yet I think the packages are still young and there is much more that can be done; it's early days.


The trouble is that an editor shouldn't require a 2010+ MacBook Air in order to be blazing fast. As I recall, we were running syntax-highlighting editors on ~166 MHz Pentium machines in the mid 90s. Hell, maybe I even ran one on my old 80486 laptop.


I have a 2014 Macbook Pro and it's sluggish there. I have half a dozen Linux machines around me daily and it's entirely unusable on them. You can visibly tell when the GC is running, it's that terrible.


Time to launch various editors on my SSD-backed, 4x4ghz Windows desktop:

ST3: instantaneous; VS2013: 1 second; Atom: 3 seconds.

You know you've got problems when Visual Studio is running rings around you.


You just described a load of things that it's amazing at:

- it's hip (don't discount that; it's important. It excites people and gives contributors a reason to contribute)

- it's in JS: maybe not right for you, but there are lots of people who have the opposite opinion.

- it's open source: for some, a closed-source editor is a deal breaker. Imagine the people who say "I hate Emacs and Vim; I'd love to use Sublime but it's closed source": they'll run to Atom.

- it's got corporate backing (GNOME Builder may die or wither; many believe Atom won't, because of GitHub's backing)

So those are some pretty big niches right there!


None of those things are actually features.


Doesn't matter. They're reasons to use it. (I'd argue being in JS is a big feature for many, but since it doesn't matter, nothing to discuss :)


Sublime seems to be back on track. In the last week it received two updates :)


The biggest issue with Sublime is that it's been unchanged for over a year. I think that's the big problem that Atom is solving - Sublime + active development.


Sublime devs released a build today:

http://www.sublimetext.com/forum/viewtopic.php?f=2&t=17509

The product is alive but understandably slow with a single developer.


Single developer shouldn't be that big of a factor in slow development and complete media silence for six months to a year on end; any half-decent developer can churn out an update every week, whether it's a new feature or just a bugfix. That's how you keep people excited about your program, and how to prevent it from seeming stale and dead.


What is it that you would like to see fixed in or added to Sublime Text?


I don't use it; I just know that this is the reason many of my coworkers are moving away from it. I've heard concerns about it breaking on a future OS upgrade, or just failing to keep up with others in features.


https://twitter.com/sublimehq Development will speed up in 2015. Are two releases in a week 'stable' enough for you?


Is my Twitter screwed? No tweets since Dec 2013.


Except it HAS changed. http://www.sublimetext.com/3dev


If you're looking to lose any and all good will you've established, charging all previous customers $70 to get updates is a great way to do it. ST2 dev is dead, which is what matters to the people who already paid for the product.


The 2MB/slowness isn't something that can just be "coded away". It's a direct result of the technology stack chosen. It is not a solvable problem. Therefore Atom cannot succeed beyond niche usage.


"not a solvable problem" and "cannot succeed" are very strong assertions that are not backed up by evidence.


Why is there a 2MB limit anyway? Is it because the document is rendered to HTML? In this case, why don't they render only the part currently visible on screen?
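The viewport-only idea the question suggests can be sketched like this (an assumed fixed-row-height model, not Atom's actual internals): keep the whole buffer in memory, but only materialize DOM for the rows that intersect the scroll window:

```javascript
// Return only the lines that should be rendered for the current
// scroll position, assuming every line is lineHeight pixels tall.
// The rest of the buffer stays as plain strings in memory.
function visibleLines(allLines, scrollTop, viewportHeight, lineHeight) {
  var first = Math.floor(scrollTop / lineHeight);
  var count = Math.ceil(viewportHeight / lineHeight) + 1; // +1 covers partial rows
  return allLines.slice(first, first + count);
}
```

With this scheme the DOM cost is proportional to the window height, not the file size, which is how editors with no practical file-size limit tend to work.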


https://github.com/atom/atom/pull/1607/files

The editor is intentionally crippled.


They were probably tired of people complaining about problems with this corner case of opening humongous files in an editor meant for editing source code. Atom should properly be optimized for kilobyte-sized files, not megabyte-sized ones.


Machines these days have gigabytes of RAM, so handling a file only a few megabytes in size should hardly be an issue.

I'm the author of the Zeus editor, and I regularly use an old 1.25 GHz Pentium with 512 MB of RAM to test the performance of the editor.

Even running on that 15-year-old machine, Zeus stays fully responsive while editing a 60+ MB file.

For a modern-day tool, a 2 MB file limit seems rather low.


Atom is essentially a web browser running a text editing web app. Given how much web pages tend to "expand" when loaded into memory in a browser, it's not so surprising.

I'm guessing that Zeus isn't based on a web browser but was designed from the beginning to be an editor.


> Zeus isn't based on a web browser but was designed from the beginning to be an editor.

But Atom was also supposed to be an editor...


Zeus, like a lot of text editors, is an in-memory editor, which basically means it loads the whole file into memory.

So it too is limited by the total memory available to the application.

For a 32-bit Windows application the available memory comes in at around 2 GB, and for a 64-bit Windows application that explodes to 8 TB.

So even if Atom were 32-bit, it should have 2 GB of memory to play with.

I'm not sure why Atom limits files to 2 MB, but if it is because of memory expansion, there must be some serious expanding happening there.
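To put rough numbers on that expansion (all of them assumed for illustration, not measured from Atom): a plain in-memory buffer holds a 2 MB file as about 4 MB of UTF-16, but rendering every line as styled DOM spans could plausibly multiply that many times over:

```javascript
// Back-of-envelope memory estimate. The per-line figures are
// assumptions chosen to show the order of magnitude, not measurements.
var fileBytes = 2 * 1024 * 1024;          // the 2 MB file on disk
var utf16Bytes = fileBytes * 2;           // JS strings store 2 bytes per char
var lines = fileBytes / 40;               // assume ~40 bytes per line
var domBytesPerLine = 2000;               // assume ~2 KB per rendered line (nodes + styles)
var renderedBytes = lines * domBytesPerLine;

console.log(utf16Bytes / 1024 / 1024);    // MB as a raw string
console.log(renderedBytes / 1024 / 1024); // MB once every line is DOM
```

Under these assumptions the raw string is 4 MB but the fully rendered document is around 100 MB, which is one plausible reason to render only the visible portion rather than the whole file.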


Why do all threads about Atom seem to be full of people opening huge files with text editors? I've never had performance issues with Atom with a 2013 MBA.


The C++ project I'm working on at the moment has many dozens of 1 MB header files. It's very common for source files to go past the two-megabyte mark. Splitting these files is no option either; there are over three thousand of them, and compilation times are slow enough already (many hours without distributed build systems).

As for data files, we have XMLs (yeah, I know..) over 50 MB in size.

Atom is a long way from supporting such projects. This probably doesn't apply to most web projects, but 2 MB is far from a huge file :)


>As for data files, we have XMLs (yeah, I know..) over 50Mb in size.

SOAP dependency?


No, just data definitions used by the application. Most of it uses generated parsers and serializers but we still need to manually edit the files now and then.


Even on my ~10 KB files, syntax highlighting and scrolling lag, on a computer where I play Battlefield 4 at almost the highest settings.


Right. I think this is the point that people are really making.

Right now, Atom is just too slow when compared with other editors such as Sublime. Given enough time it shouldn't be an issue (maybe), but right now I think it's a huge pain point for a lot of devs.


I'm not sure I can think of any editor that's as laggy as atom.

In all fairness though: I wouldn't notice 99% of the time, and I can well imagine that this problem will disappear with time.


Emacs' performance can be pretty damn bad sometimes, but I don't use Atom enough to know their relative positions.


I just switched back to emacs after testing atom for a few weeks, and my biggest issue with it was the speed of opening files from a closed client. emacs deals with it by daemonizing emacs-server; atom deals with it by trying to be performant, but failing. In this day and age, I'd rather sink some cheap memory into keeping a daemon running.

If atom has a daemon feature that I never found, or one upcoming, I'd consider it again. Until then my workflow involves too many client closes for me to not notice the significant delay that atom has when opening a new file.


I feel like daemonizing Emacs should be the default. I can happily add random stuff to my init.el which causes it to take a little while to cold start, but I never see a cold start so from my perspective Emacs is instantaneous for editing a random file. I keep hundreds of files open at all times and never have an issue with performance. I just go through and prune my open buffers once a month or so, for organizational reasons more than resource usage.


It depends on your usage. Sure, source files are (hopefully) not going near that size, but many people regularly open multiple-MB log files or data files.

Also, for me at least, it's the principle of the thing -- we are talking about a measly few MB of text; if you can't even open that, IMO something has gone horribly wrong.


Why are you opening log files with a text editor?


If you are looking for unusual conditions, scanning through visually is sometimes easier than it would be to try to grep for the particulars of an entry, especially if you don't know the particulars of the unusual condition yet.

Some editors with "minimap" functionality (Sublime Text and Emacs that I'm aware of, maybe others) can give you a fairly good visual overview of large swaths of a file at once. Sometimes strange entries are obvious just because of the shape of the line and in those cases the minimap makes it easy to see when something unusual is happening.


Please enlighten us.


Today I learned that in 2015 a 2MB text file is "huge."


How often do you find yourself editing a 2MB file of code? If the answer is "often", I'd strongly suggest refactoring.


Never code, but data, all the time. I write scrapers and such that produce MBs worth of output text. For quick hacks, large text files are great. I even drop large binaries into my text editor just to search for a string sometimes (easier than opening up a cli to strings | grep). At my job I frequently work with XML files measuring 100+MB, I didn't make that design choice, but I gotta work on it now.


OK! Let's refactor this 2MB file!

Wait... we kind of need a text editor that can open the file, don't we?


I frequently find myself sifting through dozens or hundreds of megabytes worth of data. Sublime handles it just fine, FWIW.


I often use ST to analyse some log files. Do you have any other tooling suggestions besides the command line with grep/awk/... ?


Not really, no. I can't say I've ever used a text editor to search a huge log file in that way.


It is rather large to be editing by hand.


99% of the files I open are less than 2MB, but every once in a while I have a large file to open. I think this is typical.

VisualStudio handles them instantly, but even Sublime Text seems to take a bit of time "processing" it.


Because no one wants to use a laggy text editor; it doesn't matter how sleek and hip it is if they can't fix that.


Why do all threads about Go seem to be full of people who cannot live without generics?

Snark aside, feature X is rarely absolutely needed (you can replace your generics by some boilerplate; you can use grep instead of your text editor to process huge files), but in some cases it can make your work significantly easier and that's why people want it.

In my own case, opening video files in Sublime and having the hex and ASCII side-by-side made my job significantly easier when I was writing a parser. I could easily select portions of the data (e.g. an embedded metadata), annotate it with comments from the spec, then paste it in a unit test.

I have Atom installed, but see no reason to migrate since Sublime works well enough. My only concern is that the Sublime package ecosystem will likely start rotting now that Sublime itself is essentially abandonware, so at some point I might need, e.g., autocompletion for TypeScript and the best package will be on Atom, not Sublime.


Atom chugs even with small files, at least when compared to Sublime.


I see this limitation mentioned often when Atom is discussed.

On the constructive side of things, anybody have recommendations for editors that do handle large text files well?

Obviously vi and emacs do. (Correct?)

OSX - TextWrangler and its big brother BBEdit are the best I've found - but this is not something I've extensively researched.

Windows - UltraEdit boasts that it can edit files of arbitrarily large sizes and in my experience this is true. UltraEdit costs money but I have an old version I keep around for when I need to work with bigger files.


On the Mac, BBEdit is incredible. A several-hundred-MB log file will bring Sublime Text to its knees, and BBEdit has no problems at all. +1


I haven't looked at it for long, but as I remember, it could help if there were a visualizer for those files. Often it's a PNG or a JAR; it would be less frustrating if it could open anyway, just in the right viewer.


What do you define as a large file?

For example, Atom seems to think a 2 MB file is large.

I personally think 100 MB files are quite large.

But some users with massive log files consider gigabyte files to be large.


Your definition sounds fine to me!


Sublime Text. I've used it for files > 2GB and it seems to handle them fine.


vi, nano, micro emacs, vim, emacs, gedit, geany, Kwrite, Kate, notepad++, sam, acme


The 2MB limitation should be handled when 1.0 is released; meanwhile, the view-tail-large-files package might help.


You can write loops or recursive functions to make your code smaller.


This is good news, but I'm a little confused: if io.js supports ES6, why do you need 6to5?


ES6 support is not a binary thing. io.js has a newer version of V8 that supports more ES6 features than node.js's V8 version, but it's not even close to all of ES6, so 6to5 shims the rest of the ES6 feature set. See here for more info: http://kangax.github.io/compat-table/es6/
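To make the partial-support point concrete: generators are one of the ES6 features that V8 in io.js runs natively (this example works there unchanged), while features V8 doesn't ship yet still need 6to5 to compile them down to ES5:

```javascript
// An infinite Fibonacci generator: ES6 syntax that runs natively on
// io.js's V8 but would need compilation for older node.js releases.
function* fib() {
  var a = 0, b = 1;
  while (true) {
    yield a;
    var t = a + b;
    a = b;
    b = t;
  }
}

var it = fib();
var first = [it.next().value, it.next().value, it.next().value,
             it.next().value, it.next().value];
console.log(first); // [0, 1, 1, 2, 3]
```

The compatibility table linked above tracks exactly which features fall on which side of the native/compiled line for each V8 version.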


This doesn't explain it: why do you need to compile ES6 at run-time? Plugin authors should compile before distributing.

This would annoy me as a user, I don't care that you really want the await/async feature. ES5 is not that bad. It's not like the difference between Lua and Vimscript. Just use ES5 or compile your ES6 prior to distributing.


You only compile ES6 at run-time in developer mode: ES6 compiles are cached, and installing a plugin precompiles it on your machine automatically. However, if you're developing a plugin, ES6 will be compiled on the fly.

The summary is, It Just Works™, efficiently.


Cool, I retract my complaint then.


Exactly.

To be fair, you could (and I have...) say the same thing about CoffeeScript.

Just let the plugin authors write in whatever-the-heck they want, compile it to JS, and distribute it as JS, like everyone else.

On the bright side, this shows that the GitHub guys are thinking about things other than CoffeeScript at last. Thank goodness.


If I read this correctly, even 6to5 supports only 77% of ES6, which is interesting.

Too bad it's taking so long; I can't wait until full support comes to browsers.


Yes, I believe most of the missing things are things that cannot be shimmed, such as typed arrays.


With projects like 6to5, you don't have to wait.


Thank you for this! I was a little confused.


What does this mean for Atom? Is it faster? Is the compiled size now smaller?


Most likely it means that there's a newer V8 involved. Otherwise there's not a tremendous delta.


> This is bigger news for io.js than it is for atom.

I think that's the lede.


Yes. It is arguably larger news for io.js.

However, the fact that io.js ships with V8 4.1 is quite a large step forward for dependents, not just for language features but also for under-the-hood stability and performance.

Of course, there is also everything that node 0.11.x has, plus a release schedule to match Atom's own release cadence. :)


io.js also ships with the node.js active core developers...


This is interesting, and a very big business move to make the switch. Does this say something for the future of Node?


I thought the whole point of io.js was that they would work on the project faster than Node but eventually merge back into Node once Joyent became more transparent. The Node release cycle was pretty slow, and they had some dispute with the contributors (the top 5 original Node contributors are now working on io.js).


I don't think the merge back is going to happen. The longer they stay separate, the more unlikely (and apparently harder) it is for them to merge.


Why would the io.js team bother merging their changes "upstream" to Node? io.js has momentum and mindshare.


They publicly stated they will merge back in upstream eventually all being well. They're doing it for the good of Node.js, not to create a rift and divide the community's efforts.


IIRC the Node Forward initiative was more explicitly focused on merging back than io.js is. The subtext of io.js seems to be "we'll move forward with or without Node moving forward with us."


Yup. Also because Joyent asked that Node Forward not use the "node" term, even though it was in the public domain a lot longer than Joyent has been around.


plus 6to5 support! :)


Any idea if that would mean auto compilation for your own files in a project? Or is it more for plugins and such?



I'm curious as to how they will handle 6to5 versioning. 6to5 seems to be very much in a "move fast and break things" stage, and I'm not sure that's going to change any time soon.


I was just wondering today when this might happen, specifically because I want to use generators in a plugin (SuperCollider IDE). Great job, guys!


90% of my work is on remote files by SFTP/SSH. How's that working in Atom?


There are a few packages which help you connect remotely.


FUSE / sshfs?


Who cares? It's going to be slow and unusable nonetheless.


Why would I ever use a text editor that uses even node.js, let alone some even newer thing? I don't understand how people can commit to such a thing.


You might use it if you are familiar with JavaScript, as many people are. It makes creating packages possible for many people.


Because it’s a text editor, not a child. If you don’t like it you can close it and the world will keep on turning.



