I wish people would stop calling a shared config a 'distribution'
In both Spacemacs and Spacevim, all that's being provided is a config. No runtime, no package manager, no changes to the actual binaries of either app.
Using and customizing SpaceMacs is considerably different from using plain emacs.
They use their own config system, their own plugin system (aka layers), and the user experience is very different, right from the start.
Of course it's emacs underneath, and you can still do much of what you'd do with plain emacs, but it's different enough for it to be more than "just a config for emacs".
It's just a config for emacs. Sure it has some functions and life-cycle hooks for you to change a few things here and there, but in the end it's a glorified config. You delete .emacs.d and you're back to default emacs.
In emacs, thanks to Elisp, you can't really distinguish between a config and a plugin, because most 'configs' use actual code to (often extensively) alter the editor's behaviour.
It's a well thought out config for Emacs. A layer is just a list of use-package statements with some config to make it work with the leader concept and vim bindings if that's set.
You can still install Melpa packages quite easily and you can still bind keys however you want. It's still Emacs.
Is there a clearer more understandable word? Your definition of 'distribution' sounds like something much more specific (changes to binaries?) than mine.
At the risk of putting words in another poster's mouth, I don't think the idea was that a distribution needs to have changes to binaries, just that it has to include the whole thing. I think the notion is that since you don't get your build of Emacs as part of Spacemacs, Spacemacs isn't a distribution of Emacs. I tend to agree. I think it could be a distribution of Emacs even if it were just upstream Emacs plus a simple conf file, but the key is that it has to include the thing it is a distribution of.
Thank you for filling in the blanks. You pretty much summed up my definition.
I can only guess at the reason why Spacemacs team chooses this term, but there is one simple problem with it.
Users of Spacemacs are encouraged (by this naming quirk) to treat it as somehow essentially different to Emacs. That's wrong.
It would be of benefit to all for the code that comprises Spacemacs (layers mainly) to be released as packages. It would place greater emphasis on Emacs as the actual core product, and allow people to use and collaborate on smaller chunks of Spacemacs, without the monolith problem.
Not familiar with SpaceVim (yet) so I can only speak for Spacemacs.
Emacs has really bad support for writing modular code (~/.emacs.d/init.el), and Spacemacs addresses this issue. Also, a lot of work went into integrating various packages and modernizing the emacs experience. To give an example, Spacemacs out of the box had 90% of the features that my custom-crafted ~2kloc config file did. And I find the (new) shortcuts so intuitive that I am considering adding them to my vimrc.
> No runtime,
Spacemacs loads your config (very fast I might add), adds a nice startup screen, gathers statistics about your config etc.
> no package manager
It also bundles use-package - which is a package manager. And makes bootstrapping emacs on a new machine trivial.
> no changes to the actual binaries of either app
Remember this is emacs. You don't need to rewrite any part of it, you can change anything with advices/hooks, and it is actually easier to do than patching some random packages.
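To make that concrete, here's a minimal sketch of the kind of change advice and hooks make possible (the helper function is made up for illustration; `find-file' and `display-line-numbers-mode' are standard Emacs):

    ;; Advise a built-in command without patching Emacs: log every file
    ;; that gets visited. Undo it later with `advice-remove'.
    (defun my/log-visited-file (filename &rest _)
      "Illustrative helper: record FILENAME in *Messages*."
      (message "Visiting %s" filename))

    (advice-add 'find-file :before #'my/log-visited-file)

    ;; Hooks cover the mode-specific cases:
    (add-hook 'prog-mode-hook #'display-line-numbers-mode)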
The difference between Emacs and Vim is Emacs Lisp. This means Spacemacs is vastly more powerful than SpaceVim from a conceptual point of view.
Emacs is a living, self-documenting environment and Spacemacs extends it quite a lot. This Lisp environment is very different from a set of plugins developed offline or mere configurations.
Calling Spacemacs a bunch of plugins and configuration is like calling any conventional program a plugin of its language and runtime. Plugins attach to an environment through a specific runtime API; Spacemacs instead extends that API, blurring the line between runtime and extension in the process.
SpaceVim is a modular Vim distribution, but it shares the same idea as Spacemacs: the goal of SpaceVim is to give users a better experience of vim/neovim.
It's emacs with vim controls. FWIW, I wouldn't compare emacs with vim in this setup, but with something like tmux instead (and imo what emacs offers is better than tmux).
All these "distrobutions" are going to do is further fragment the already extremely confusing terminal-text-editor world. There are so many configurations that it's never going to make sense to me.
If it was simple, there'd be an answer to this question: how do I setup Vim/Emacs as a fully feature complete IDE for C, C++, Rust, Go, Java, JavaScript (Node or Web), C#?
By feature complete I mean autocomplete, error detection, built in one-button-runs, automatic config/sane defaults.
Keep in mind this is a dream list. It's perfectly fine if the list only includes 2-4 of these languages but it needs to have all of these features.
Looking at this from the outside in (from people who use IDEs onto the people who use Vim/Emacs), we have maybe one IDE per language, and that's confusing for us. I can't imagine having 3 text editors, all of which have different common dotfiles.
> If it was simple, there'd be an answer to this question: how do I setup Vim/Emacs as a fully feature complete IDE for C, C++, Rust, Go, Java, JavaScript (Node or Web), C#?
That's the promise of Spacemacs:
- clone their git repository into your .emacs.d directory
- run emacs once, answer three basic questions about your preferences
- add your list of languages to the "dotspacemacs-configuration-layers" list in your .spacemacs
- I even looked up the syntax for you: "c-c++ rust go java javascript csharp"
- make it reload the config
You're done. All this should take less time than it took me to look up those "layer" names in the docs.
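For reference, the relevant bit of the generated ~/.spacemacs ends up looking roughly like this (a sketch; the surrounding boilerplate is written by Spacemacs on first run, and the layer names are worth double-checking against the docs):

    (defun dotspacemacs/layers ()
      "Layer configuration (sketch of the generated template)."
      (setq-default
       dotspacemacs-configuration-layers
       '(;; the language layers from the list above
         c-c++ rust go java javascript csharp
         ;; plus whatever general-purpose layers you want, e.g.
         auto-completion
         syntax-checking)))

Then reload the config (SPC f e R, if I remember the binding right) or just restart Emacs.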
> By feature complete I mean autocomplete, error detection, built in one-button-runs, automatic config/sane defaults.
Yes, all that is the promise of Spacemacs.
If anything, I found it to be too much of a kitchen sink (it tries too hard to be "smart" about balancing parentheses for me, etc.). Still, you might want to give it a whirl if you have 15 minutes. I ended up going back to vim for almost everything, and I'm on the fence about whether to use Proof General from plain Emacs or from Spacemacs. But the time to check it out is not wasted.
That question doesn't even make sense. What is a "project"? My Rails project is going to look radically different from someone's Go project, from a Rust project, from an iOS/Swift project....
If you're talking about "project" in the Atom / Sublime / Eclipse sense of the word, then a "project" is just a folder. If that's what you mean by "project", then yes, you can create folders in a text-based UI, and you've been able to do so since the dawn of unix; basically every editor out there has an easy way to either create a folder with some clicks or enable direct access to the underlying shell to create one.
Personally I'm thrilled by the fragmentation in this space -- I've been using Vim since 4.x, and vi before that. I've never found the plugins particularly enticing; every time I try using things that are outside the core distribution, they end up taking more time to maintain than they're worth. Since I mainly stick with the core distribution, Vim has felt pretty stagnant since it got code folding and syntax highlighting, more than 15 years ago. I've been trying alternatives -- spacemacs feels bloated, neovim is still carrying a lot of baggage (e.g. support for vimscript). So far Kakoune is the one I'm most excited about, to the point where I am right now updating my neglected Cygwin installation so I can try to build it for Windows. (Kakoune needs unix-style select, so native Windows builds are unlikely.)
> By feature complete I mean autocomplete, error detection, built in one-button-runs, automatic config/sane defaults.
By and large, I find features like these distracting to the point of unusability, which is why I like living in stripped-down editor land. Different strokes for different folks.
Exactly, and I don't think vim is a good platform to build an IDE on, despite a long history of people trying. I'm not an emacs guy, so I'm not speaking from experience here, but my impression of emacs is that it's already an IDE in the terminal. It's got a bit of a reputation for being a massive pile of elisp hacks, so maybe guile-emacs is the future there?
There is a learning curve, but at least with MELPA setting up Emacs is pretty easy. Company is a good autocomplete backend, Irony is a good back end for C/C++, and Company-Go and Emacs-Racer will do you for Go and Rust. Each of these can be installed by using interactive commands (M-x package-install), and configured with a few lines in your init.el (almost entirely just copying and pasting from the Github front page of the respective package).
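To give a rough idea of the scale involved, here's a sketch of the init.el glue described above, pieced together from those packages' READMEs (treat it as illustrative rather than canonical):

    ;; Make MELPA available to M-x package-install.
    (require 'package)
    (add-to-list 'package-archives
                 '("melpa" . "https://melpa.org/packages/") t)
    (package-initialize)

    ;; Company as the completion front end everywhere.
    (add-hook 'after-init-hook #'global-company-mode)

    ;; Irony for C/C++, wired into Company.
    (add-hook 'c-mode-hook #'irony-mode)
    (add-hook 'c++-mode-hook #'irony-mode)
    (with-eval-after-load 'company
      (add-to-list 'company-backends 'company-irony)
      ;; Go completion via company-go.
      (add-to-list 'company-backends 'company-go))

    ;; Rust completion via racer.
    (add-hook 'rust-mode-hook #'racer-mode)
    (add-hook 'racer-mode-hook #'company-mode)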
I like the fragmentation, because different people have different needs. E.g. I prefer a much more minimalistic setup than my colleagues who use IDEs like Eclipse. I've never been able to comfortably use an IDE. And not a single vim config I found on the internet was right for me.
That said, I think many would agree that for C/C++ YouCompleteMe is a must-have. It has ctags functionality (but without false positives), error detection, autocomplete... And probably lots of other features I'm ignorant of.
There is a semi-official vim plugin for go. I used to use it when I was playing with this language. Same for Rust. I don't know about other languages you've mentioned.
As for the "distributions". I prefer to configure everything myself, but I get it that others might want everything preconfigured for them.
PS. Yes, I'd agree that vim has insane defaults. All vim config files I saw have a common part, I think.
This is generally what roadblocks me when I try to pick up vim. The part I hate about Atom is reaching for my mouse. But I like that I didn't have to sacrifice my firstborn to get a sane setup for three languages simultaneously.
I was in a similar boat. The reality for vim is that, generally, the syntax checkers and code completion plugins are hacks; honestly, go read the source code. When I did, I was aghast that people were using and promoting them (specifically the java plugins). It's been a couple of years, so take this with a grain of salt, imho, etc.
I think if one really wants to pick up vim (or emacs) you have to set aside the IDE part of your brain. For example, I still use an IDE for java, but I use vanilla vim with no plugins for frontend (html, css, javascript, ember cli). There it's a text editor and that's all I need. Plain vim actually has A LOT of functionality. If one learns the vim way, they might find they don't need nearly as many plugins.
I've always suspected that the fastest way to learn vim is to install the default and not copy someone else's ~/.vimrc at first. Edit it whenever you need some specific functionality, and edit often. Also read the :help docs a lot.
But I admit it would be nice if the vim project shipped a sane default setup that was aimed at the average programmer.
I've been using ~5 IDEs for the better part of two years now. I tried to hold my leaning tower of Eclipse together but that didn't end well. I love Eclipse because it has all of the features I need that many other IDEs don't have yet, but at the same time it has one of the worst and least intuitive plugin systems (funny how that works out, right?). There's no hope for polyglots, I think, because nothing will ever be developed for us. Companies are usually Java, Scala, Python, or JS shops and there isn't much need for multi-language tooling, but sadly that's most of what I need.
IntelliJ becomes a cobbled-together mess of half-complete plugins if you attempt to push everything into that single beast of an IDE.
I currently use CLion, PyCharm, PHPStorm, Project Rider and a few other JetBrains IDEs, but sadly their autocompletion and features aren't on par with Eclipse's autocompletion and integration with Java. Nothing really comes close to that in my opinion.
I started using Spacemacs recently for its out-of-the-box IDE-like features but the layers' functionality seems to be fairly hit-or-miss and overall the impression is one of a bug-infested nightmare. Every time I try to set up my environment on a new machine, I get some new set of problems that I have no idea how to solve (disappearing powerline, broken neotree, weird projectile caching behavior totally breaking my workflow, busted syntax checking, and more). I don't blame the Spacemacs folks for this; I actually think what they've done is really cool. Emacs simply is what it is, I think. I don't have time nor inclination to learn elisp and build/fix my own environment.
When it does work, it's great. But the inconsistency is probably going to drive me away, back to Vim. I think that if anyone can build a similar thing for Vim that actually works, I'll use it in a heartbeat.
I have the exact opposite experience with vanilla Emacs. It works consistently without issue.
Regarding plugins you can use melpa and track the latest release of everything at all times.
You can use melpa and update most stuff infrequently save for the ones that are most important to you.
You can use melpa stable and run only stable versions of everything.
You can use melpa stable and run newer versions of a minority of plugins that are important to you.
What plugins you choose to use and whether you choose stable or unstable could very well lead to vastly different experiences.
The best strategy is to carefully select plugins which are useful to you and high quality, and run the stable versions of most things, updating your tools infrequently when a new major release adds something that looks useful to you.
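A sketch of what that mix looks like in an init.el (the pinned package here is just an example):

    ;; Prefer MELPA Stable by default, but keep regular MELPA around and
    ;; pin a handful of important packages to it.
    (require 'package)
    (setq package-archives
          '(("gnu"          . "https://elpa.gnu.org/packages/")
            ("melpa-stable" . "https://stable.melpa.org/packages/")
            ("melpa"        . "https://melpa.org/packages/")))
    (setq package-archive-priorities   ; Emacs 25+
          '(("melpa-stable" . 10)
            ("gnu"          . 5)
            ("melpa"        . 0)))
    ;; Example: track the latest magit regardless of the priorities above.
    (setq package-pinned-packages
          '((magit . "melpa")))
    (package-initialize)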
There are a lot of factors here; were you on dev or master? What I suggest doing is using dev, but pinning yourself to a particular commit. Once you see that things are working, use that commit across all your machines. You can have a fork that you never push to, except to fast-forward when you feel like upgrading. Just pull from that fork on new machines instead of from the main repo.
The only other thing that can break you other than updating spacemacs is updating packages, but that's a factor with vanilla emacs and with vim. This happens occasionally but not too often.
Also keep in mind: spacemacs is not all or nothing. There are many, many layers that are higher in the dependency tree, that both get less attention and are easier to swap. For instance, even though I mostly develop in C++, I don't use the C/C++ layer from spacemacs. There were things I wanted that were missing (like rtags), and things that I didn't want that were there (like older tags solutions).
I started out by just configuring rtags inside a single simple file, the same way a vanilla emacs user would, and I dropped the C++ layer. I eventually added more stuff and wrote my own C++ layer. I'm quite happy with the result. I still got a ton of useful layers/packages from spacemacs that were easily setup and configured, like evil, magit, helm, company, flycheck, etc. And the whole layer system itself made even the parts that I customized myself more modular.
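In case "wrote my own C++ layer" sounds intimidating: the skeleton is tiny. A sketch of a private layer's packages.el (the layer name is mine, and the rtags hook is just one way to wire it up):

    ;; ~/.emacs.d/private/my-cpp/packages.el
    ;; Declare which packages the layer owns, then give each one an init
    ;; function. Enable it by adding `my-cpp' to
    ;; dotspacemacs-configuration-layers in ~/.spacemacs.
    (defconst my-cpp-packages '(rtags))

    (defun my-cpp/init-rtags ()
      (use-package rtags
        :defer t
        :init
        (add-hook 'c++-mode-hook #'rtags-start-process-unless-running)))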
That's a general problem with many 'frameworks'. When something goes off the happy path, the abstraction is gone and the user is dealing with the underlying layers: Elisp for Spacemacs, JavaScript for the hotness of the month, and X11 for Ubuntu.
I don't think it's really possible to build something like Spacemacs for Vim because of the different underlying architectural decisions. Vim is a do-one-thing-only-and-do-it-well tool. Emacs is 'an operating system missing a decent text editor'.
I had the same experience with spacemacs, but it made me realize what I was missing with vim. Currently trying to build up an emacs/evil config from scratch.
I don't get the Spacemacs hype. I tried it, and never felt comfortable using it. Started with a blank .emacs.d; yeah, I completely cleared the whole configuration two or three times until I hit the sweet spot. Now I'm sitting on a ~250 loc config on vanilla emacs (including evil), and everything works like a charm. Plus I learned a lot about elisp. I never liked those bundles and distributions. They always felt redundant, but I can understand the appeal to some people.
The OP's point was not "equivalence", it was a "sweet spot". Even with "quite a few different languages", if the standard Emacs modes are close enough to the sweet spot, you don't need Spacemacs.
Some love bringing their emacs/vim setup to their own perfection and spend a lot of time on it. Others are too lazy for that and are fine with a preconfigured setup like SpaceMacs. Whatever you prefer.
My point just was, for someone like me, SpaceMacs is a truly awesome project.
The point is: how much of Spacemacs do you end up using, and in which languages do you work? Because what consumes your time is not setup (with use-package, setup becomes trivial, really); what sucks you in is learning the plugins and how they work (cider, ensime, alchemist, magit, merlin - those are serious environments, not just syntax/autocomplete plugins). I don't know if they come with Spacemacs, I doubt it, but then what difference does it make whether they come with Spacemacs or not, since most of your time will be spent learning them, not setting them up?
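(To show what I mean by setup being trivial with use-package: one declaration per package; magit here is just an example.)

    ;; Self-contained: install if missing, defer loading, bind a key.
    (use-package magit
      :ensure t
      :defer t
      :bind (("C-x g" . magit-status)))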
Spacemacs is so customized that it is like learning a new editor from the ground up, at least for me. I spent last year solely using and learning Emacs, to the point where every command and keybinding made sense; before that I used Vim for years. After one year of training I decided to install Evil, and now I am enjoying Emacs and Vim a lot. Yesterday I downloaded Spacemacs, opened it and I was like, ugh, wtf now?! I just got overwhelmed by the amount of preexisting code which I did not understand, and would have needed to sink a lot of hours into it to get everything from ground zero.
Just configuring all of the plugins with good key bindings takes a fair amount of time. Just one of the things that SpaceMacs provides.
After a week or two with SpaceMacs I tried to start from a plain emacs and build up my own config, but I gave up. Just too much hassle for me.
I probably would not have enjoyed SpaceMacs if I had used emacs at all before. But I switched directly from Vim with no prior emacs experience, so for me it's a perfect fit.
Yup, you are right. When I deleted Vim and installed Emacs, I wanted to learn Emacs and Lisp. That said, I found configuring Emacs from scratch pretty good for what I want, so I stuck to it. But I think it is not as scary and hard as you might think! In 2 weeks I think you'd be satisfied with your new humble and clean Emacs configuration. :)
Magit (with evil-magit) in Spacemacs is wonderful. It has the right balance of modal keyboard shortcuts and a nice UI. Took me some time to feel comfortable with it, but now I'm faster with it than in a normal terminal.
Yay for the effort! But personally I don't think this is a good way to learn and use vim. The strength of vim is customising it to your own needs, and you'll do yourself a disservice if you end up with a config that you don't know and don't use more than x percent of.
This is one of the most obtuse project descriptions I've seen. If you're making a...whatever this is...aimed at Vim users, why would you expect them to be familiar with a something-or-other for Emacs?
The main difference is that in Spacemacs, you hit the space bar and you get suggestions of commands to use, then you can narrow those down by navigating those groups. For example, if you want to search your project for a file, you would hit the space bar then type "p" (for project). At that point a number of choices appears, e.g. "f" (IIRC) is to find files in your project.
After a while it becomes sort of second nature:
SPC f s - file save
SPC m t f - mode test file (e.g. if you're in a go file it'd know how to run tests for go files).
Another nice thing is that `SPC ?` brings up a prompt that lets you search for commands: you type a keyword for example and you get all the editor commands that match your keyword, so it's easy to navigate your editor shortcuts. That was useful for me, since every result came with the "spacemacs" shortcut, and the Emacs shortcut, so I ended up learning Emacs for free (I was coming from Vim).
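(For the curious, those mnemonic bindings aren't magic; they're declared with small helpers you can reuse for your own commands. A sketch, assuming current Spacemacs conventions and the gotest package for the test-file command:)

    ;; In dotspacemacs/user-config: `SPC o' is reserved for user bindings.
    (spacemacs/set-leader-keys "og" #'magit-status)

    ;; Major-mode bindings live under `SPC m' (or the `,' shortcut):
    (spacemacs/set-leader-keys-for-major-mode 'go-mode
      "tf" #'go-test-current-file)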
Note: I've also tried Spacevim and the only thing it has in common with Spacemacs is the mnemonic key bindings. It doesn't seem to be discoverable (at least not with SPC ?) and it also doesn't seem to be consistent (I couldn't "guess" what certain things were, in spacemacs it's consistent enough that you can guess). Those are the main core pillars imho: http://spacemacs.org/doc/DOCUMENTATION.html#core-pillars
In the long run that's not likely, because vim isn't nearly as extensible as emacs is; thus it's unlikely that the long tail of emacs functionality exposed by spacemacs will be duplicated in vim and made exposable by spacevim.
If you want an extensible editor, use emacs. If you want a vi-like extensible editor, use spacemacs (or just evil-mode on its own). If you want a lean editor, use vim. Torturing vim (and writing tortured code to torture vim) into a simulacrum of emacs just doesn't make sense to me.
… and people wonder why I dislike Markdown. (There are fairly radical rendering differences between the GitHub page and the spacevim.org page, because different Markdown renderers with different bugs have been used, and the Markdown was fairly sloppily written.)
Yes I recognized the markdown fails from my own markdown fails. Especially when you move repos between github and bitbucket, and expect README.md to 'just work', it seldom does. I've gone between md, org and even rst. They are all great, but still suck in curious ways. pandoc helps but it is a whole kettle of fermented fish in itself.
One big hint: leave blank lines between block elements. (It’s the same in reStructuredText, except that it actually complains if you do it incorrectly rather than being silently inconsistent as all Markdown implementations that I know of are.) Not doing that is the main thing that’s wreaking havoc here.
The table rendering issues I suspect are from the lack of a newline after the headers. I don't know whether it's correct or not because, yknow, markdown.
reStructuredText is at least consistent. And extensible. And secure. It has a decent spec, as well. It’s just a pity that it never gained widespread acceptance or other implementations and is thus effectively tied to Python.
I like reStructuredText and work in it regularly. I’m sad that Markdown with all its terrible inconsistency, lack of extensibility, insecurity, &c. has prevailed.
The reasons why Markdown is popular are I think very similar to why HTML is popular. In its first years, it was loosely specced, very liberal in what it accepted and was parsed differently by different implementations (browsers).
When this actually became a real problem, HTML5 came out to spec it much more rigorously.
RST in that picture is XHTML. Rigorous spec, but very picky with the input. Typo? Your whole document won't render.
I've been bitten by XHTML and I've been bitten by RST. The experiences are very similar. I agree with the need for a more rigorous spec but like with HTML it still needs to be liberal with its input.
So that's what we need, and I just hope it doesn't take years to come into existence. An HTML5 for markdown. And we shall call it MD5!
Tried Spacemacs 3-4 times because of org mode. Most specifically inline code block execution, through org-babel. Still not ready to change over from vim, though.
What I haven't done is check neovim. Any comments regarding the potential for inline code execution?
Org-mode comes standard with plain emacs (no need for all the extra bells and whistles that come with spacemacs, necessarily). Since you're a vim user, try installing evil as a first plugin, and only adding additional ones as you see fit. I found that spacemacs did too much and preferred vanilla emacs + evil + a few python plugins. Still, I mostly use vim for smaller scripts. Emacs is better as an IDE in my opinion.
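(If you go that route, the evil bootstrap really is small; a sketch, assuming MELPA is already in your package-archives:)

    ;; Install evil on first run, then turn it on globally.
    (unless (package-installed-p 'evil)
      (package-refresh-contents)
      (package-install 'evil))
    (require 'evil)
    (evil-mode 1)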
In my experience, you really can't beat Org-mode when it comes to inline code execution. You can preview LaTeX and plotting inline as well. Plus, you can of course export the document into a variety of formats (HTML, PDF, etc.), which are quite customizable thereafter. Here is a document I put together with Org-mode for a short C++ seminar [1]. All the code is executable while editing the document in Org, which is nice for a number of reasons. I also added some custom CSS/JS for the HTML export.
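(The org-babel side of that is a one-time whitelist in your config; after this, C-c C-c on a source block executes it in place. A sketch:)

    ;; Enable inline execution for a few languages in Org source blocks;
    ;; ob-* support for these ships with Org itself.
    (org-babel-do-load-languages
     'org-babel-load-languages
     '((emacs-lisp . t)
       (python     . t)
       (shell      . t)
       (C          . t)))  ; the `C' entry also covers C++

    ;; Optional: don't ask for confirmation before every block.
    (setq org-confirm-babel-evaluate nil)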
vim:

    mv ~/.vimrc ~/.vimrc_bak
    mv ~/.vim ~/.vim_bak
    git clone https://github.com/SpaceVim/SpaceVim.git ~/.vim

nvim:

    git clone https://github.com/SpaceVim/SpaceVim.git ~/.config/nvim
Why do people always expect the average joe user to understand this? What do I do with this? Do I run this in the terminal? Then what? First line opens Vim and it opens 11 files. Now what? This seems like the average explanation that works for the person who writes it, but is useless unless you know this already.
For vim, do:

    mv ~/.vimrc ~/.vimrc_bak
    mv ~/.vim ~/.vim_bak
    git clone https://github.com/SpaceVim/SpaceVim.git ~/.vim

For nvim, do:

    git clone https://github.com/SpaceVim/SpaceVim.git ~/.config/nvim
This is awesome! For someone coming into the more "conventional" editor space, scouring github/the internet for plugins is a huge pain, and it's even harder to find ones that cover your cases and are kept up to date. Having this all rolled into one and having it all work together in harmony is a big win for onboarding people to vim!
Disclaimer: I haven't tried this package but it does seem to have a lot of promise.
This uses NeoVim; for a similar bundle using Vim there are some options like Steve Francia's https://github.com/spf13/spf13-vim (called the Ultimate Vim Distribution).
The greatest dev timesaving tip that someone gave me was to clear my .vimrc and start off with a blank slate. vim is a language in itself [1] and it's easy to try to mimic someone who has conversational fluency in it, but as with languages, the only way to learn is in practice, from the start. Projects like SpaceVim try to create an IDE-like environment, but because they are terminal driven, it's difficult for anyone to "discover" features. You end up with a large distribution with barely used features. Instead, by growing and maintaining your own vimrc, you get something that can both be used everywhere (dotfiles and Github FTW) and work FOR you. At the end of the day, vim is JUST another tool to make you work more efficiently.
Trust me. "truncate -s 0 ~/.vimrc" and take the time to learn. You won't regret it in the long run.
Have you really saved time, though? I know next to nothing about elisp, yet I can be perfectly productive with Spacemacs. Being able to edit my .spacemacs file and add a functional layer for an entirely new programming language with sane defaults seriously beats fighting configuration files to do the same thing anyways.
This sort of 'configure your own' extremist software purity movement is great for some people, but if you want a setup that meets your needs 95% of the time, and works well, Spacemacs is great. I'm sure learning to configure your own vimrc has been a rewarding experience, but I don't think you can really argue it's a time saving measure.
When I forked my dotfiles project I took the time to learn what most of the changes, commands, and config did. I removed a lot that I didn't find relevant. But I don't think I'd have gotten near to the current state if I'd started from scratch.
I think there's value in learning what prepackaged configs provide and tailoring them. I don't necessarily agree that starting from scratch is optimum for most though.
I don't quite understand this point. Why does the exact starting point of tinkering (existing .vimrc vs a completely blank one) matter, unless you somehow think the empty .vimrc defaults are the most sensible?
Like, there could be a bunch of features that are barely used and not discovered, but seems like the easiest way to discover new features is to scan through a good .vimrc, rather than sleuthing through online resources.
Starting with a blank slate lets you refer to just the program's documentation to configure it, while if you start with a config bundle you'll have to refer to both the program's documentation and the bundle's documentation, and you'll have to be decently familiar with both how the program works and what the bundle does in order to make creative customizations.
Using a configuration bundle makes the learning curve less steep but taller, essentially.
I've seen this sentiment here and there but I've yet to appreciate it... I've observed co-workers that do this, and while they do get very handy with their selected subset of tools, they tend to use far less of the available ecosystem than a promiscuous copy-paster, or a framework user, like me. I think it's got a lot to do with temperament - some people like more control over less, and find the secret squirrel magic below somewhat sinister. There is no way that I would be able to build spacemacs from scratch, but I do benefit a lot from riding the integrated magic beast, even if it sometimes goes into holes no amount of cursing and praying can prevent.
> "> ~/.vimrc" works on bash, but it hangs on zsh because zsh treats it like "cat > ~/.vimrc" (">" without any command in front of it is not well defined in POSIX)
So a portable way to do this is:
I discovered recently that the default install of vim on Ubuntu (and I guess Debian in general) includes some customised configuration (/usr/share/vim/vim74/debian.vim), which makes it a bit harder to grasp what's going on.
FYI, Vim 8 ships a defaults.vim that sets up some nice defaults for new users, and an empty .vimrc file disables processing of these files. A 'null' .vimrc looks something more like:
Oh, yeah! Good old curl | sh via HTTP. Fucking awesome. I wonder how many people get unknowingly pwned while executing such curl | sh instructions. It's not hard at all to MitM HTTP, detect if a shell script is being transmitted, then add your own instructions to it. Or you can just compile a list of known sources that encourage executing curl | sh and MitM only them, if you want to make your exploits even more discreet. What a nice way to get yourself your own little botnet. Way to go! There's a catch though: you need access to an ISP's infrastructure. But is it that hard to get such access? No.
For me the insecurity isn't even the worst thing about curl | sh.
I like package managers, and I tolerate tarballs, because I know what they do and how to reverse it. I care about the organisation of my filesystem and suspect that the people who suggest I pipe their script into my shell do not care at all.
I'm so exhausted with the top comment being some off topic rant that doesn't contribute anything to the actual discussion but attracts a lot of reactions because it's easy to say something about it...
And yes, I am aware of the irony of this being completely off topic. ;)
curl | sh is no more dangerous than running any script you haven't read, or any binary you haven't decompiled for that matter. The odds of MitM are exactly the same.
There are 2 types of "any script you haven't read, or binary you haven't decompiled": 1. scripts and binaries you obtained from somebody you trust in a safe authenticated way; 2. everything else.
When I update my Debian distro, I get binaries via HTTP, that's true. But I also have the public keys of my distro's maintainers whom I trust. The authenticity of every binary I download is automatically checked using those public keys. It's an example of "any script you haven't read, or binary you haven't decompiled" type 1.
curl | sh via HTTP is an example of "any script you haven't read, or binary you haven't decompiled" type 2.
curl | sh via HTTPS is an example of something that is very-very close to "any script you haven't read, or binary you haven't decompiled" type 1.
The presence of https prevents most mitm vectors. It doesn't prevent the far more likely scenario of the website in question being broken into and serving malware or backdoorware.
Point is, don't run binaries you don't have some way of vetting. curl | sh is no worse an offender than "download link here".
When criticizing someone, it's important to take even more care to be civil, and to do your best to ensure there's little chance of being misinterpreted. One can easily interpret what you consider irony to be snark, which is definitely uncivil.
In my opinion, if one is truly trying to be civil, the response here would be more along the lines of an apology for being unclear and misunderstood, rather than defending the action.
As for your comment being deleted, that was not the action of any single individual (and from my experience, definitely not a mod): it's the result of actions of a number of users in the community. Asking for others to take action on a particular comment—yours or otherwise—goes against HN guidelines.
> that was not the action of any single individual (and from my experience, definitely not a mod): it's the result of actions of a number of users in the community
Judging by how fast my comment above got deleted, it's definitely a mod or a group of mods. By "mod", I mean any privileged user. BTW, that comment was not uncivil and it was also substantive: I raised an important issue about the HN community. Yet it got deleted.
> the response here would be more along the lines of an apology for being unclear and misunderstood
I don't think there's any sense in apologizing when everything I say gets deleted.
Judging by how fast my comment above got deleted, it's definitely a mod or a group of mods.
Do you mean an organized group of users who are downvoting? If so, I haven't seen evidence of such a group on HN. There is a strong reaction by many community members to what's perceived as uncivil behavior, as the HN community as a whole values the quality of the forum.
By "mod", I mean any privileged user.
The HN mods are dang and sctb. They are indeed privileged users, as they have the ability, among other things, to ban users. They've also got a lot of responsibility, which they take very seriously and with — from my perspective — great care.
You're admittedly new to participating on HN (or at least this is a new account). Please take some time to reflect on the community reactions to your comments, and how your personal views of civility and substantiveness may differ from those of the community as a whole. This goes as well for assumptions of how the community works and behaves. This isn't meant as a value judgement of your views, just that they appear to differ, if at least a little. After all, this is the community you'd like to participate in. You can also observe how others behave on the site to better understand community standards.
Please take this in the spirit it's intended. As one HN user to another, I'd like for you to have a positive experience as well as contribute constructively to HN.
If you have questions that aren't answered by the guidelines and FAQ, you can contact the mods by email (see the contact link in the footer). I've also found general search engines (e.g., DuckDuckGo, Google) and the HN search tool (in the footer) very useful to find answers to many questions I've had about the HN community.
Do you never install any software outside Debian's main apt repos? Never add a 3rd-party apt repo for anything that maybe you didn't trust as much? Never clone a git/hg/svn/etc. repo from somewhere and build & install the software yourself?
Ok, let's say you don't do any of that. The Debian developers are still fallible, and they certainly don't audit the source code of everything they package. Sure, you have to trust something, and I'll certainly agree that it's more reasonable to trust the Debian packagers (and their distribution infrastructure) than a lot of other things, but saying "curl | sh" (when you're curl'ing over https at least, from a website you can reasonably expect not to be compromised) is always bad is a bit extremist. At any rate, the shell script you download is there for you to inspect before running, and if you're not happy with it, you don't have to run it.
Personally, my main beef with that sort of thing is that I want to know if the script is going to accidentally clobber any existing files, or install things to a location where I might not prefer them to be installed.
Edit: I'm only now realizing that the OP has you curl from a non-TLS webserver, which I imagine is the first thing that got you upset. Ugh. Asking people to install something that way is just irresponsible.
> The Debian developers ... don't audit the source code of everything ...
Strawman. If an amount of critical code is audited (and it is) it's still much better than nothing.
> I want to know if the script is going to accidentally clobber any existing files
Packages are tested on various hosts and architectures. The packaging system checks for files being overwritten and verifies that the package can be removed cleanly.
Also, suspicious things (e.g. insecure file permissions) are checked for. Sandboxing tools are often used to contain daemons.
Furthermore the package content is tracked, while "curl | sh" cannot guarantee that the same script will be received every time or by every user.
> Strawman. If an amount of critical code is audited (and it is) it's still much better than nothing.
Is it, though? I'm specifically thinking about the Debian fiasco a few years ago when a packager broke OpenSSL's key generation. Clearly less scrutiny is paid than one might think. I'm unable to find any evidence/documentation/anything that suggests that code audits by Debian packagers of critical packages are done regularly (or ever).
I did find a few links to some Debian-specific tools to aid code auditing, but nothing to suggest where they're used, how often, and on what packages. Regardless, they look more like linters and static analyzers -- nothing that would help you discover backdoors or just flat-out malicious behavior.
> The packaging system checks for files being overwritten...
Yes, I'm well aware, not sure why you're bringing this up. I was merely pointing out (regardless of any other argument being made) that file-clobbering is a reason why "curl | sh"-style installation bothers me, personally, much more than possible security considerations, which I consider to be overblown.
> Do you never install any software outside Debian's main apt repos?
Only the official repos, yes, because anything else would be insecure.
> Never add a 3rd-party apt repo for anything that maybe you didn't trust as much?
Heck no, never, ever. Not even once. That's insanely foolish. Frankly, I view, 'please add my PPA/repo to install' as a different way of saying, 'I don't know enough about security for the software I write to be installed on your computer.'
> Never clone a git/hg/svn/etc. repo from somewhere and build & install the software yourself?
I view that as somewhat different, given that the source is there and has a weakly-cryptographically-secure history (weak because it's SHA1), so that if someone ever did something bad then it'd be easy to prove it.
Cool, then you actually are "paranoid" enough (sorry, not trying to use that word in a pejorative manner). I frankly don't think most people are, and I don't necessarily think that's as dangerous as some might make it out to be.
How do you end up dealing with things that aren't packaged for your distro? Do you end up downloading source, doing some checks to whatever level makes you comfortable, and package yourself? Or do you mostly either not find yourself in that situation, or just take an "oh well, I'll deal without it" attitude.
I use a few 3rd-party repos, though ones that I would consider more trustworthy than a random PPA (for example, Google's Chrome apt repo). I certainly don't trust them as much as Debian's official repos (but, again, I'm not sure how much that trust is actually rational!), but I'd consider it unlikely that Google would get compromised or slip something nasty in (and even if they did, it's not like the Chromium source in the Debian repo has been audited).
> I view that as somewhat different, given that the source is there and has a weakly-cryptographically-secure history (weak because it's SHA1), so that if someone ever did something bad then it'd be easy to prove it.
That's all well and good, but that sounds like an after-the-fact reactionary thing. It's little comfort to be able to prove something bad happened after you've been owned.
I just feel like most of desktop security, even on Linux is on pretty shaky ground, and we greatly overestimate the care we take when installing software. I think you're probably ahead of most people by using only official repos, but it's like the classic analogy of a chain with a single weak link: all it takes is one "git clone" of something that does something malicious, and that's it. Sure, the probability of a successful attack is reduced by avoiding 3rd-party repos, etc., but attack possibilities are still very much there, and it feels like people don't seem to see that.
But, overall, yeah, the state of desktop security wrt software installations is so much better on nearly any Linux distro than common practice on Windows or macOS, it's crazy.
No, installing a package using my package manager will
a) ensure this exact binary/script is GPG signed by someone I ostensibly trust
b) ensure it's not been tampered with by a MitM between the hosting server and my computer
This reduces my risk exposure to "creator of software X (or distro packager Y) has turned rogue", which is many orders of magnitude less likely than "website of software X got pwned, or someone is MitM-ing me".
Assuming the script being fetched by cURL is hosted over HTTPS that is reasonably well configured, it basically provides the same guarantees.
Unlike a distro package you can also just -not- pipe it to the shell immediately and actually read it.
Even if you did unpack every distro package source before you installed it, with the amount of crap and boilerplate you would almost never be able to work out -exactly- what it will do.
The way I see it the "curl | sh" people have found somewhat of a middle ground. It achieves similar convenience to "apt-get blah" but with simplicity level closer to binary tarball.
It's almost as easy to work out what it's going to do as the tarball but it will also do it for you automatically.
> Unlike a distro package you can also just -not- pipe it to the shell immediately and actually read it.
IDK about other distros, but for the one I use (ArchLinux) it's trivial to inspect the install script, and they (PKGBUILDs) are usually well structured and easy to read.
Unfortunately the majority of package managers are bloated and awful. Pacman/PKGBUILD and emerge/ebuilds are notable exceptions, though there is also Nix and many super simple based tarball systems that are also very good.
But all of that is irrelevant when 99% of the Linux world is running RPM/APT.
You trust the distro maintainer and have a security mechanism to ensure the scripts and binaries have not been tampered with.
So virtually no risk of MitM and very little risk of malicious scripts and binaries.
Reading scripts and decompiling binaries does not imply you understand everything they do, but it does require an effectively unlimited amount of time, which is impractical at best.
No, it really doesn't. I was not comparing to distro packages (which are an excellent solution to the problem, like I said elsewhere) but to download links.
I know of the attack. It's very unlikely to matter in practice.
There's a ton of potential issues when you're curl-piping; all those issues are present in some form when you're simply doing curl first then sh.
It's far more egregious to download a hard-to-inspect binary then run it, than it is to download a shell script then run it. Although it's harder to inject code on the fly, if you're at the stage where you're worried about these kinds of attacks it makes very little difference.
I don't see people bitching about https://www.terraform.io/ offering download links for example. So yeah, it gets tiresome to always see "oh! curl|sh! boo! bad!" - there's actually no better way of sharing such scripts short of making and vetting a package with a high-entry-bar distribution. And a two-step curl + sh instead of a pipe is security theater.
https would certainly be appreciated on that site but that's a general issue, regardless of the download script.
If you believe shell scripts are hard enough to write that systemd can take over, then you should also believe that they're hard enough to inspect that you want some level of auditing.
RPMs are inspectable, signed by default, and integrate better with your system (the same is true of apt). And if they're in the package repository then there is a maintainer who is ultimately responsible for ensuring the quality of the code.
For the second time here, I'm not dissing package managers... I even explicitly said the solution is "vetting a package with a high-entry-bar distribution".
But the whole point of high-entry-bar distros is that it's highly curated. Some random script from the web will not be in there. How do you serve those that need to casually distribute trivial software?
There are some answers to that question, but they're not widespread at all. It's one of the major failures of package management on Linux. macOS gets it mostly right.
Are you claiming that macOS has better vetting of its packages than Linux? And what do you mean by "Linux" anyway? -- lumping RHEL in with ADistroIMadeInMyGarage is a bit odd.
Well, Linux doesn't have ad-hoc package management.
Because there is no standard packaging format between distributions, the best people can do is release a deb that may or may not work on either debian or ubuntu. Or an RPM that may or may not work on RH/Fedora. Or a tar.gz that will contain sometimes source code, sometimes a binary, and if it has source code there's no clear way to compile it, if it has a binary there's no clear way to install it.
So instead: curl | sh.
It's pretty ridiculous that it's still a problem, tbh. What we need is somebody with the right political clout at either RH or Debian to sit down with devs of the other, come up with a decently generic solution that fits both systems and a backwards compatibility policy. It doesn't have to be perfect nor the best, it just has to be good enough to support the use cases of most Linux distros.
Once both Debian and Fedora support it, Ubuntu (and its many flavors) and Red Hat will follow. And if it's a generic enough system, with a pluggable-enough backwards compatibility policy, it's not unlikely for it to be adopted in more minor distros. If it's sold correctly, distros will gladly adopt it - we've all been waiting for this a long time.
I guess I do not understand the manner in which you use "ad hoc" to refer to package management.
It seems as though you are now talking about cross-distribution package management. It is further confusing that you are contrasting all "Linux" to the single distribution macOS.
To me, the term "ad hoc" is current w.r.t. NixOS, GNU Guix and other purely functional systems, and there the distinction is made between:
1) declarative -- in which a rebuild of the system will ensure the package is present and configured as expected; and
2) ad-hoc -- in which the user can affect all of the system with side effects and rebuilds will not result in the same state.
Why would you want to allow users to randomly and arbitrarily affect all of the system and end up in an unknown state? Isn't that exactly what curling shell scripts achieves? And what method does macOS implement in order to avoid this?
If all you're talking about is cross-distro package management then you and your users can either give up on this and standardize on one OS and just pony up the cash for RedHat (same as you do for macOS) or else start writing Flatpaks.
What you have is an acute case of righteous rage against a flaming straw-man. The only medicine against this affliction is girding your loins and guarding your mind against those seemingly plausible but ultimately horseshit security theater tropes, so popular among the impressionable today.
Attacking curl-wielding nerds is not going to get you a botnet.
I think scrollaway is here to troll. No need to reply to him seriously. You may reply to him in a humorous manner or just scroll away from his comments, just like his nick recommends us to do.
Personal attacks are not allowed on Hacker News. You've already broken HN's civility rules more than once. Please (re)-read the following and stop doing that:
Is it vim plus a bundle of plugins?
EDIT: Cleaned up the spacemacs sentence.