Ask HN: Would you work at a startup that requires you to use Linux?
34 points by rabh on Sept 22, 2010 | 95 comments
/throwaway account

All of our systems (development, live, etc.) run Linux with a relatively complex configuration. We're thinking of issuing our first new hires (all software devs) pre-configured Thinkpad T Series (http://shop.lenovo.com/us/notebooks/thinkpad/t-series) and only allowing local development on those machines.

We suspect it is possible to get our development system running in other operating systems (OSX/windows), but we have no idea how long set up will take and prefer model and OS standardization.

I realize some 2/3 of you use OSX. Are you particularly attached to the Mac and/or its OS - or would you be willing to work with a Thinkpad/Linux development environment?

Note: We're cool with anyone choosing their IDE, GUI, etc. We'd just rather not deal with A) buying and managing radically different laptop models, B) getting packages our system requires working in other operating systems, C) insecure systems (e.g. HD encryption is a must).

Extra question: A more general question is how often do startups mandate using their issued equipment for work? I've seen this done at every place I've worked thus far but am not sure if it is generally true.




If your stuff runs in Linux, your stuff runs in Linux - people can deal with that. I love OS X, but will use anything that's necessary to do a job...I have a desktop at home running Win7/Ubuntu specifically for that purpose, though I almost never touch it except to manage long-running tasks that I'd rather not tie my laptop up with.

Mandating that all dev must be done on a particular Thinkpad model, though, is a bit of a company smell. I can't think of many good reasons to prohibit other machines from being used.

I'd suggest the Thinkpads, by all means, but I wouldn't ban everything else; unless the hardware requirements are very specific, different hardware should make no difference at all, as long as people have the same software running.

Even as far as OS goes, if someone's able to get everything going on their own time on their own Macbook, is that really a problem worth worrying about? Provide the standard config, but if someone wants to go above and beyond, who cares? They've always got the standard rig to fall back on...


> I'd suggest the Thinkpads, by all means, but I wouldn't ban everything else

The author specifically mentioned the need for a complicated setup including hard disk encryption. I take that as meaning the development environment includes some company secret that should be protected as if they were nuclear launch codes. Which, in turn, is difficult to do at all and hideously difficult to do on your own.

I've always been both a happy Linux user and a happy Thinkpad user, but I guess it'd be a productivity boost if you let people choose between different models (e.g., a 15", a 14" and a 12" model, because people have different tradeoffs between mobility and primary screen size).

And if the contents of your development environment have vaguely less security impact than nuclear launch codes, you should consider how much productivity exactly you're willing to forego for reasons of paranoia. (Using a different development environment is pretty much bound to lead to distractions because you need to actively think about key combinations etc., especially when switching between the braindead-but-largely-consistent Mac key combos and the more-standard-but-sometimes-less-so Linux app key combos.)


http://news.ycombinator.com/item?id=1712035

The distros have killed Python

Just a counterpoint to "If your stuff runs in Linux, your stuff runs in Linux"


It's a terrible counterpoint. If your stuff runs on linux, it's perfectly possible to install a second python and use that (if you need any of the features added recently, which most of the time you probably don't).


No it's not - also from the linked article

Not only that, but in many distros if you "upgrade" you can actually destroy base systems needed to manage the OS. Because upgrading Python is either painful or nearly impossible...

Point being that it is not unreasonable for companies to expect homogeneity and focus on innovating on features, rather than supporting dev environments.

In most cases, it isn't a problem, but often it turns out to be. For example, my linux machine has vm.overcommit_memory set to 0, which Redis ain't comfortable with.
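For instance, a quick sanity check of that setting looks something like this (a sketch; the /proc path is Linux-only, and actually changing the value needs root):

```shell
# Linux-only: 0 = heuristic overcommit, 1 = always allow, 2 = never.
# Redis's docs suggest 1 so background saves don't fail on large datasets.
if [ -r /proc/sys/vm/overcommit_memory ]; then
    cat /proc/sys/vm/overcommit_memory
fi
# To change it (as root):  sysctl vm.overcommit_memory=1
```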

What I notice, however, is that people don't mind when they are "forced" to work on OSX (and given free Macbooks). I wonder if it is truly standards "Nazi-ism" that people are concerned with or if it is simply a case of teh shineeyy.


> Because upgrading Python is either painful or nearly impossible...

It's not actually that difficult. You need not upgrade your python. You just need one more python. I have been successfully using this setup on RHEL 5 and Ubuntu. And you can couple it with virtualenv and pip to have a sandbox. I would say it has become standard practice in the python dev community.
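The "one more python" sketch looks roughly like this (assuming a python3 on PATH; the paths are hypothetical, and on 2010-era interpreters you'd use the virtualenv script where this uses the stdlib venv module):

```shell
# Give the project its own interpreter and site-packages without
# touching the distro's Python. --without-pip keeps this sketch
# offline-friendly; normally you'd omit it and pip install into the env.
python3 -m venv --without-pip /tmp/myproject-env
/tmp/myproject-env/bin/python -c 'import sys; print(sys.prefix)'
```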


I don't think I saw it mentioned here, but one thing that you should consider is that restricting the platform can give the impression that you are standards Nazis. I left a company for just this reason. I ran the web shop and could not even put a Mac on the network "due to security reasons" to test Safari (back when it was a mac-only product). Later, when I wanted new software (CMS, source control), I had to comply with company standards so that we could leverage purchasing agreements and political deals. There was never a focus on the right tool for the job but rather decision by committee and back-room deals.

It was a painful experience and one that drove talent right out the door. I literally saw the best and the brightest no-showing for work, never to be seen again (not on my watch). At the career level of a senior developer, if you have people no-showing with no notice you have a serious problem.

Anyways, point being, when I see "we use platform X and only X," it translates in my mind to a company with clueless management and a bunch of lazy admins who don't want to do their job. Nothing will drive developers away like admins putting restrictions on how they get their work done.

Be very careful of the image you present to prospective hires if you restrict their system choices. Developers and designers cherish their freedom to create; if you restrict it you could drive the best and brightest away from your company.

Also, don't take this as I am saying this is what you guys are doing. I just want you to be aware that it can send this message, as I am sure a good deal of senior technical people have similar war stories to my own.

Personally I would adopt a virtualization technology and tell the developers it has to work in your officially sanctioned virtual machine. How you get there is of no consequence to me, so long as it is a reproducible process. Tell people what to do and not how to do it; the how is what you hired them for.


I wouldn't work for a business that didn't trust me to choose my own tools. This includes things like locking down the system so it cannot be customized.


I edited the summary to note that we are not (entirely) blocking system customization.


Blocking system configuration is a totally different issue. A lot of coders are especially tied to their OS of choice, and with Linux, even different distros can be night and day. There are some distros that I don't think I could reasonably work with (not to name names). And I suspect a lot of coders are pretty inherently mistrustful of what might be lurking in a preinstalled system that they aren't allowed to wipe. (Keystroke logging? Remote VNC? Is the camera on right now?)

I haven't worked at a place yet that didn't budget a little time up front for me to get up and running, but it's mainly the feeling of restriction that gets to me. What if the coders get upset enough that they start developing remotely, pushing all their code back and forth over ssh/rsync/whatever, and using their laptops as thin clients? Do you block ssh traffic? Ouch. Lots of coders are brilliant and love subverting restrictions. It seems keeping everything locked up could turn demoralizing for both sides to say nothing of the huge time sink.


I wouldn't particularly care that it's linux but I'd sort of take it as a bit of a red flag that even as a startup you're so obsessed with controlling your employees. One of the reasons people want to work at startups is because they get to be more independent and free. It also tells me that you're planning on hiring a low standard of employee who is not capable of figuring out deployment issues on their own platforms. Seriously, if you're hiring half good people they will be figuring out and telling you how to deploy to whatever platform they like best.

FWIW, my preferred development environment is still ... Windows. Even though I spend most of my day typing unix commands (either to servers, VMs, or in my Cygwin prompt). I have about a hundred utilities that I've slowly cultivated over the years and these make me as efficient as hell. I'd actually like to have a good reason to switch such as a company mandate though :-)


Could you share what your utilities are?


Honestly I wouldn't work at a startup that required me to use anything but Linux.


I dream of the day I'm allowed to use Linux as my main OS. Maybe time to move jobs.


We're allowed to use anything. The way it's fallen out: 1 person with Linux on PC, 2 with Linux on Macs, 3 on Macs, 2 on Windows 7. As others have said, virtual machines are your friend.


You should totally ignore the cost of the laptop and managing it. It's a really stupid thing to worry about when the productivity of your hires is 10000x more important.

All great programmers have spent many years programming. They know how to make themselves productive. Let them buy what they need (within a budget).

Even big companies could do this. They're just too stupid. There's no excuse for startups.


> Let them buy what they need

Sure, here's $700. Go buy whatever you need.


Compared to a hacker's salary, the difference between $700 for a Thinkpad and $3000 for whatever they want is negligible.


Maybe it's because I've been working for 20+ years, but I actually care very little about what OS I have to use - just about every contemporary OS is "good enough" to get work done on.

However, obsessing about this kind of stuff just seems kind of wrong for most start-ups - the amount of time you spend thinking about this kind of thing probably doesn't give you the reward you think it does and you might well miss out on the extra benefits of testing stuff in a more heterogeneous environment.

In this day and age thinking about this kind of stuff without considering VMs is silly - they are more than fast enough to use as day-to-day dev environments.


Regardless of the Linux requirement, these things stood out to me:

"only allowing local development on those machines"

"insecure systems (e.g. HD encryption is a must)"

That kind of restriction had better come with laws mandating it. I would not work for a company that doesn't trust me to use my own machines to do work. I think that your worrying about "managing radically different laptop models" shows that you aren't trusting your devs to be competent enough to manage their own machines.


Requiring hard disk encryption doesn't sound unreasonable to me, especially considering the speed and convenience impact these days is pretty minimal. Laptops are stolen/lost all the time.

That said, I'd be hesitant to work for a company that demanded I use specific software for no apparent reason. There are plenty of good reasons, of course, though it's not clear whether any apply in this case. Either way, I don't think I'd tolerate not being root on my development box.


> you aren't trusting your devs to be competent enough to manage their own machines

I can manage my own Linux machine and have been doing it for the last 7 years, but it would be nice to have someone competent enough in the company to which I can delegate such tasks.

Considering the shit you have to do to get some hardware working properly on Linux, having a standard selection of hardware is actually pretty good (I have been giving this advice for years), and if that means the company "doesn't trust" me, then I could do without that trust.

Also, any company will have administrative personnel or sales/marketing people, i.e. not everyone is a Linux guru. And going Linux (instead of OS X / Windows) makes sense for a startup since it reduces the costs involved, but only if you can fix your employees' problems with their setup; otherwise it becomes a serious time-sink.


I'd guess that HD encryption is there to prevent theft of data if a laptop gets left in an airport lounge. It's a bit paranoid (really, nobody's trying to steal your source unless you are very successful), but it's not a terrible idea.


Standardizing development on a very specific environment is not unreasonable and is actually fairly common. If you know what sort of environment your software will be running in, then it makes sense to capitalize on that and avoid wasting effort - getting your dev system running in a zillion different environments, with ever-so-infuriatingly-slightly different versions of make/bash/whatever will chew up a ton of dev and testing time for no real benefit.


I do everything in linux already (development, web surfing, documents, etc), so I would have no problem with using linux at work.

I think there are more Linux users on HN than you realize. I found this poll: http://news.ycombinator.com/item?id=687267 which seems to indicate that there are roughly the same number of linux users as OSX users.


I suspect a lot of people using OS X would be perfectly fine with Linux, but Apple's hardware and support go so far above and beyond what you get from most stock Windows laptop manufacturers, it just feels like you have a better product in your hands when you buy a Macbook Pro than it would if you bought the relative beast of a machine you could get if you went for something non-Apple.

And OS X works well enough (and does things in just a Unix-y enough fashion) that there's no really compelling reason to use Linux on the Apple hardware. Not to mention those of us that do iPhone dev...


I agree that for most development projects, OS X will work fine. We've had interns in the past, though, who failed to get even the basics of our system working in OS X. The complete system is guaranteed not to run due to kernel modules we use that are only available for linux.

On the hardware: is not receiving Apple's a dealbreaker? Apple's hardware is definitely quite nice, and if we issued someone some low-end netbook I could definitely see a morale issue. But a Thinkpad T-series is very high end (imo, better than a Macbook when using linux).


Not at all a dealbreaker, no - if you're buying the systems, it makes sense to buy what you are able to afford and support the best.

I was more responding negatively to the idea that other machines wouldn't be allowed to be used in development at all (personal machines, for instance, which almost every person worth hiring will have at least one of, and in many cases, which they'll vastly prefer to use), but I re-read, and realize that you mentioned security concerns. That changes matters quite a bit, assuming the security issues are really important (having an internally-perceived killer app and worrying about it getting out -> not necessarily very important; working in an industry where competitors regularly steal trade secrets [finance and gambling, for instance], or where regulations/investors demand tight security, -> important!); I would usually be annoyed if I was not allowed to use my personal machine for dev, but if there are legit security requirements, it's a lot more understandable, and I wouldn't have any problems at all using a standardized stack.

Honestly, from my point of view I'd be fine with any OS restriction that's not Windows. It sounds like OS X is not a real option for the work that you need people to do, so it's perfectly reasonable to insist on Linux, and the Thinkpads are very nice machines (I actually quite like the "nub" for mouse control), so I don't think you'll hear too many complaints if you offer them.


If you're mandating a specific OS then you're going to risk losing talent that prefers other OSes. On the other hand you will attract people that like said OS.

At our place we don't give a rat's behind about what you run in a VM, but your main OS is either Windows 7 or OSX because it has to run PGP WDE or Becrypt, and neither play nice with multi-boot OSes.

Having said that I know one guy in the office who just boots up his Win7 box, fires up an OpenBSD VM and does everything in there, and no-one has any problems.

Virtualisation on the desktop is mostly free. Let people who have a valid business case for Windows use it, in a VM if you're uncomfortable with bare metal Windows support.


it's important to test on the same configuration as production. that said, why not allow virtual machines?


Testing, yes... but laptops? It is important for QA and/or staging environments to match production, but not developer laptops.


The real value I've gotten from using a vm is that it's easy to replicate across different developer machines. It's quick to setup and all the developers have matching dev environments. I also don't like messing with my base system all that much just to get a web app running.


If it makes sense, I'd have no problem doing it. If I don't see a logical reason behind it, I'd take it as a warning sign that the company wants to control my development OS without cause. If you make it clear in the job posting and you have an actual reason behind it, I don't see any problem.


I think you should let your developers choose which OS they prefer to use, with the clause that they have to make it work themselves, and warn them that getting help for Linux is a great deal easier.

I have always been able to pick my own OS in the places I have worked so far. I would be suspicious of a place that tried to mandate any OS, even if it happened to be my preferred choice.


It would be hard for me to work in another system, so yeah - linux is definitely ok. I even prefer to work on whatever runs on the live systems, since I'm basically doing devops kind of work (even if it's not actually called that) - so when we were on centos, I was on centos; after moving to debian, I was on debian...

On the other hand, how good / responsive is your internal IT support? If I'd have to wait for someone to install some packages on my work PC, I'd end up with half of the system recompiled under my $HOME rather soon. It may not be true for other development environments, but I need to test out a lot of new stuff on my own.

However, I wonder why no one has mentioned that using laptops as a main dev machine is a weird idea. Low performance, not ergonomic for long work at a desk, weird straight keyboard, small screen... I didn't know any better previously, but right now I'd give up lots of other stuff for an MS Natural keyboard and 2 screens (seriously - that's like the complete minimum for a serious developer).


What's to say those laptops aren't supplied with 2 large monitors, a dock, an ergo keyboard and a nice mouse? The advantage of laptops is it gives you the flexibility. You can get beefy laptops.


Well... you certainly could do that. Then again, you're not as flexible, since your desktop gets messed up once you disconnect from 2 screens.

I'm pretty sure that a sensible developer's desktop + sensible developer's laptop cost about the same as one beefy dockable laptop which can handle 2 screens. Seriously beefy laptops tend to be not very portable unfortunately because of the size + weight. (I do not consider laptops above 17'' portable anymore - if it doesn't fit in a backpack, it's too large) Also working on the laptop all the time will kill your battery (or you might risk leaving it out for a long time and discharging it completely) - which breaks the flexibility idea again.

Plus, If I had a desktop and a laptop, I'd just use synergy to have 3 screens available ;) Laptop screens are perfect for having terminals with `tail -F some logs` quickly available ;)


The tools around docking are pretty good. Much better to have that than 2 separate dev environments (laptop and desktop). In terms of bulk, my laptop is a 13" so not very bulky. It could be a bit more powerful but the main problem is the HDD (not SSD) and the awful corporate virus scan.


Honestly, I think of a company like an F1 racing team and the developers as the drivers. It's OK to have policies and processes in place, but the devs/drivers are the ones who have to use the equipment the most and will be most in tune with its nuances. I have always believed in empowering the developers as much as possible because, at the end of the day, their productivity is a direct result of their own actions and not the management's.

I would say have a configuration in place, but if a developer wants a different setup, let them be the ultimate decision maker in the tools they use, even if the management thinks that they have something better.

The only company I've ever worked for (out of 4 web design jobs) that insisted on certain hardware/software was an ad agency where the owner had no knowledge of web development but liked Apples because they thought they were cool. I had to put up with a first-gen Intel Mac on OSX for about 9 months before I left.


run your dev environment in a VM


I second this. My team uses whatever they want. We have Windows users, OS X, and Linux. As long as virtualbox or vmware runs on your system, we don't care what the host OS is.


What sort of work do you guys do? Also how bad is the speed hit?

From this: http://superuser.com/questions/146623/performance-impact-of-... it seems like it'd be noticeable (20% I/O hit).


The post you linked to says the most noticeable hit is going to be disk I/O. Does your app do heavy disk I/O? If not, you have nothing to worry about. The CPU hit is barely noticeable in regular use, especially if you have Core2 or i3/5/7 machines.

VMs are fast enough these days. So fast, in fact, that I'm regularly blown away by them even though I use them so much.


Thanks for the info.

We're not concerned about disk I/O but the app does have heavy network I/O. (though latency rather than bandwidth is the main concern).

Also, I assume for virtualization you are talking about the worst case (perf-wise): running Linux in a VM inside another operating system? Rather than the better one of having a hypervisor directly manage both the user's main OS (win/osx) and linux?


> Also, I assume for virtualization you are talking about the worst case (perf-wise): running Linux in a VM inside another operating system? Rather than the better one of having a hypervisor directly manage both the user's main OS (win/osx) and linux?

Yes. I usually run Linux and WinXP on top of OS X and WinXP on top of Linux. Works great!

EDIT: I use VirtualBox OSE. VMWare works even better if you can afford it. Sometimes I use Qemu for trying out toy operating systems, but I find it too slow for actual work.


We do web dev. The pain point for me is having to use rsync to "deploy" to the vm every time I make a change. That's a pretty decent tradeoff though when the alternative is maintaining tons of different development configurations on your host machine.

Our local deploy process is mostly scripted so it's not much of a pain point to have to redeploy new changes.


Is it necessary to sync?

I'm liking the VM ideas people have been talking about here; is it possible to just mount the guest OS' file system on the host (perhaps by running an NFS server in the VM) and develop directly on it?


Yes, but this is highly dependent on the VM app you are running and the OS you are running it on. At a previous job I mounted drives over SSH from a windows host to a linux guest in VMWare.


VirtualBox and VMWare Fusion both allow you to share folders from your host OS to the VMs, you know?


Don't think it's a good idea. OS X works great for all of my development needs, even if the target environment is linux. Also, I tend to believe linux users are especially picky about which distros they prefer.


Yes, I would work for a company like that. As long as this configuration of yours makes sense and isn't terribly onerous. In general, I would prefer to work for a company that developed on linux. But that's because I'm a linux user.

I've worked previously at companies that required development environment standardization on a Windows XP machine in the same way you're discussing. That was a little difficult for me. I'd never developed on XP before, I grew up on Mac and switched to Linux in college. But with some help from my coworkers I got used to it. And the standard configuration made sense. It was the configuration our app was intended to run in. The one our fairly small customer base all used. It wouldn't have made any sense to develop in my preferred environment because the application wouldn't have even run in my preferred environment!

If your program runs live only on a system with your particular configuration, such as with web applications, then it makes perfect sense to have the development environment mimic it. In fact, I would want the development environments to mimic it as closely as humanly possible. The testing environments, on the other hand, are a different story. But I'll assume you have that ironed out too.


Teams should be allowed to use whatever tools best suit their work style. We stopped the draconian IT practices a year ago and now allow our developers to have literally whatever they want for a laptop. Productivity went up. 2/3 of the team run Macs, 1/4 run PCs, and the remaining fraction run Linux. Seems Mac is the clear winner in my shop. Interoperability issues are somewhat a part of life, but the good outweighs the bad.


Well, I'm a linux guy - I use it for desktops and servers and love it, and find it a million times easier to deal with than windows or osx - so obviously I'd be fine with a linux-based startup. However, I don't think the question is aimed at ready-made linuxers.

I've come across this same question from a different angle before, when applying for jobs. At the interview stage I always ask if I'm free to choose my dev environment; the majority of companies are fine with it. I've come across a few that require a windows environment, and even some that require the eclipse or zend IDE (I prefer gedit), and because of this I have passed on the job, whether I was offered it or not.

The main reason is that I'm a million times more productive on my setup of choice, but it's not the only one: it's also a red flag that the company's culture is rigid and probably a shitty place to work.

Obviously there are exceptions, but I think the best rule of thumb is to let developers choose their own tools, set them a task, and get out of their way.


As an experienced developer I would say don't restrict the team to a single OS. Using different OS's (even different flavours of) forces you to write good quality platform neutral code which can be a great benefit if you should need to switch your deployment platform. Different compilers will expose different compile time bugs and different OSs will expose different runtime bugs. Plus there are no platforms that have ALL the best tools a developer needs.

If you go multi OS you will need a better class of developer though. It's easy to find developers that have no time for cross platform issues believing their platform is king and others are just porting their code. In my experience they are Windows developers with no experience outside of Windows but I'm sure it's a cross platform issue.

If you don't think this will help your business... stuff it, go linux only. Any good developer can develop on ANY platform.

Good luck, I imagine, in the big picture, this is the least of your worries.


As a mac user I'd say that I'm more productive using Mac OS X, but that's not to say I wouldn't work somewhere that used Linux.

We have this identical issue and what works very well for us is to use local virtual machines (we use VMWare) because then we can use exactly the same configuration, independent of the development platform. For example, we have Mac and Linux users, but could also easily accommodate Windows users as well. Other benefits include the fact that you can really lock down the VMs to be exactly as production, whereas it's unlikely that a local dev machine would be the same (Does your production machine have Firefox installed?)

Another benefit is that when we change the configuration of production (which is all managed with Puppet) we can just update the dev machines because they were built with Puppet as well, without worrying about screwing up someone's desktop machine.
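As a sketch of the idea (hypothetical class and package names, not anyone's actual manifests), one Puppet definition can drive both production nodes and the dev VMs:

```puppet
# One class, applied to production and dev VMs alike, so both
# environments are built from the same definition.
class app::runtime {
  package { ['nginx', 'redis-server']:
    ensure => installed,
  }
  service { 'nginx':
    ensure  => running,
    require => Package['nginx'],
  }
}
```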

Plus, you can take snapshots... You can rebuild them easily... etc

Lots of benefits.


I wonder how they'd respond to Linux installed on Macbook Pro hardware rather than thinkpads.

It wouldn't bother me, but I agree with others who suggested that a VM approach would be preferable if available. Or, what about a Macbook Pro in dual boot config with vmware to boot up the other partition from OSX for non performance critical stuff?


Good point on the hardware/software mix. Does the Mac have much value over Thinkpads without the OS though?


I've been using Linux for many years now exactly because of that, same OS on the server as on my dev box. I literally never looked back, in the beginning I used VMWare as an occasional crutch but I haven't fired up a VM in about a year now.

It's not only doable, it's probably a really good solution.

If a developer is so attached to any one species of window manager that they can't switch you have a problem anyway.

The initial productivity hit was definitely visible, but it did not last more than a few weeks.

I also have a mac but hardly use it (it got lent to a friend because of that); all three of Mac, Windows and Linux are viable environments these days, and a competent developer should not have too much trouble migrating.

Even though OS X is a favorite amongst many developers, it actually is the 'odd one out'; but if you intend to develop for Apple's mobile platform you might end up with a few of them anyway.


> in the beginning I used VMWare as an occasional crutch but I haven't fired up a VM in about a year now.

Out of curiosity, did you use the free Vmware player or the paid Vmware products?

I have been using Ubuntu as my personal and work system for some time now, but there are situations where running Vmware would be beneficial. I see that the license prohibits commercial use, and I think dev work would be commercial use.


The free one (I'm cheap ;) ).

It seemed to do everything I needed it to do.

I've used virtualbox as well for a bit.


> All of our systems (development, live, etc.) run Linux with a relatively complex configuration

Depends on said complexity.

You listed three notes above. (A) is odd. For (B), similar packages should be obtainable unless it's a proprietary package for your startup. (C) Encryption is pretty well supported on OS X and Linux. Scratch Windows.

I'm sure some devs would welcome a new laptop. However, being restricted to just that without good reason will strike most as odd. Trust your devs to be able to setup their own dev environment. Allow other environments if they can make it work.

At the end of the day, it seems like the underlying message is what counts. Does it appear that you are encouraging or dictating?


Thanks for the info.

On the points:

A: This may be less necessary than I believed, admittedly. I've only worked at companies with 100+ people before, all of which issued similar hardware. The advantage was the ability to swap computers: if your computer dies, you get a spare immediately (you can even swap the HD!) while repairs happen, and if someone leaves, the computer can be given to the next hire (or an intern).

B: Our dependencies are all open source. The full system (needed for testing), though, uses kernel modules which don't have a (fully working) Mac port.

C: That's good to know (hopefully easy on OS X; no one currently in the company uses it).

Finally, again at the companies I've worked for, it was pretty much expected that company development stayed on company equipment, mostly for security reasons. Is it pretty common in companies with < 20 people for devs to bring their own machines?


I see what you're getting at.

With A: Standardization is good, but it probably takes at most a day to set up a new computer. It takes weeks to learn and master a new environment, though. In a small company, productivity probably matters more than standardization.

B: Ah, but if your dev could get it working on a Mac, would that be OK? If not, why? How about BSD, or a Linux distro different from your intended setup?

C: As someone else mentioned elsewhere, enforcing restrictions is easier said than done. The key question for a dev is: why? In some ways that just discourages devs from working on the project as much as they'd like.

The key point here is that in a small (<20 person) company, you want to attract the best talent out there, someone who's at the top of their game. Most likely they will have their own machine with their own setup. A percentage of them will likely stick to their own setup instead of using two computers.

You may employ contractors too. That makes it even more likely they prefer their own machine.

Being offered a choice, i.e. "you can use your own computer, and here are some things you need to do too (encryption, etc.)" or "here's a brand new one, all set up", is infinitely better than "you must use this only".


Our infrastructure is also totally Linux based. But we don't mandate what the developers use. Personally, I prefer to use a beefy desktop machine with multiple screens instead of a notebook. When it comes to hitting the road, I prefer the MacBook.

Please consider the many suggestions for using virtualisation, and take the accusations of nazism with a grain of salt. Personally, if a developer were fantastic and preferred to use Windows, then as long as they observed security requirements it would be all systems go.

Anyway, most software that runs on Linux is also ported to OS X and Windows (support levels may differ), so setting up your dev env on other platforms could be instructive.


For the past 10 years I've _only_ used Linux as my desktop. I was using it pretty frequently before then too.

So my answer would be: If your potential employee refuses to work on Linux, or Windows, or OSX or NeXT or whatever... you probably don't want them.


Blocking system configuration isn't nice in that it limits freedom, but I think an employee must take responsibility when their tools don't work as expected: fix it at their own expense, or don't take chances and use the standard system. We had one employee who got distracted by his own geeky pursuits and changed his tool settings frequently, with no apparent benefit to the development roadmap. In conclusion, I think you have a valid right to set certain standards; you may allow changes, but at the developer's own responsibility, and take measures if things are inclined to get out of control.


I stopped working for a company because they forced me to use MS Visual tools.

I could work with MS tools, but I want to be able to choose them if appropriate. I want to be the one who makes that decision, not someone who hardly ever writes a line of code.

BTW, I program mainly on Mac OS and Linux, and every piece of software has to be multiplatform (working on Windows too). The code is way more robust if you force yourself to compile on several platforms; you discover bugs early with very little work.

In my experience, working on only one platform means letting bugs grow that will bite you at the worst moment later. To me it's a penny-wise, pound-foolish strategy.


I think companies can mandate what OSes their servers use and standardize some of the software the development team uses, but requiring a desktop OS is pushing it. I understand that many non-technical companies do this; however, operating systems tend to be a personal preference. Some people are more efficient on Windows, others on a Mac, and others on their Linux distro of choice. If you have to use one OS for work and another at home, it just creates a productivity conflict. A big "Take this OS and Shove It" from here.


http://vagrantup.com/

Others have suggested developing in a VM. Vagrant makes this incredibly painless (including lifting your workspace into the VM).
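For a sense of how little setup that takes: a minimal Vagrantfile sketch, using the Vagrant 0.x-era config style. The box name, port numbers, and folder paths here are placeholders, not anything from this thread:

```ruby
# Hypothetical Vagrantfile: one standardized Linux dev VM per developer.
Vagrant::Config.run do |config|
  # A stock Ubuntu 10.04 base box (placeholder name); the company could
  # instead publish its own pre-configured box.
  config.vm.box = "lucid32"

  # Reach the dev server inside the VM from the host browser.
  config.vm.forward_port "web", 80, 8080

  # Edit code on the host with any editor/IDE; build and run it
  # inside the standardized Linux guest.
  config.vm.share_folder "src", "/srv/app", "./src"
end
```

The point being: everyone gets the same "relatively complex configuration" regardless of what laptop or host OS they happen to prefer.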


If I can customize the Linux install to fit my working style, I'd love to. I'm using Linux for work anyway. With a customized Linux, I think neither Mac nor Windows can beat the productivity.


I already use Linux on a T61, as do two other people in the room I'm in. Add two others on X200s, and the only guy who doesn't is on a Siemens machine.

I doubt Apple is as widely used as you think.


VirtualBox, VirtualBox, VirtualBox.


Your developers need to be able to work how they're comfortable. Ultimately, setting up their development environment makes them more comfortable with it.

That said, give 'em a VM, let them use anything they want, and if they want they can attempt to replicate the VM onto their own machine. It's up to them to become competent, not for you to say "Well, you're probably NOT, so no soup for you!"


No, because FreeBSD FTW!

Seriously, as long as the development environment allows me to get things done, I'm not sure I care too much about the underlying OS.

Here everyone is free to pick the laptop OS they want. For the moment it's 100% Windows 7 + VMware. Our software is multiplatform (Windows/Linux/FreeBSD).

I submit Windows 7 will keep the high ground for developers because of VS2010.


I'm a Mac user, and I probably wouldn't like it terribly, but since most of my non-Xcode work is in vim and Terminal I would probably be OK (I can get by with web-based services for everything else).

I would definitely miss things like Homebrew though: http://github.com/mxcl/homebrew


As a long-time GNU+Linux user, I would be fine with being required to use Linux or FreeBSD. As a matter of general principle, however, I am opposed to such OS requirements when they extend beyond reasonable security and compatibility concerns. I wouldn't work for a company that required employees to use Windows, OS X, or Solaris.


Linux is a plus in general. However, in my experience, standardized custom development configurations always suck. My experience is with Eclipse configurations, but I'm willing to bet the same holds for Linux or any other predefined, locked-down development environment.


Personally, I'd like to know why a startup _wouldn't_ use Linux without a particularly good reason (e.g. developing for Windows or Mac OS X). I use Linux by preference, both when I'm contracting and when I'm working on my own code & products.


I develop on a windows platform and try to use IE as much as possible because my job involves being in charge of all of the cross-compatibility issues with different browsers. The only way to truly be an expert on this is to actually use the different browsers on a regular basis since the rendering issues change with every update.

It kinda sucks, but for front-end devs/designers like me it's a necessary evil.


Running Linux on a Thinkpad is the ideal scenario for me. Though I would like to have a MacBook as well, I don't think I would prefer it for work.

By the way, I have been drooling over both of them in the past week. Really want one of them :)


I develop at a Linux only shop.

I would use whatever OS the job requires. Most jobs [that provide hardware] don't give you a choice of OS.

If I were a .NET developer, my boss would not let me be the only guy using Mono while everyone else used VS.


I would work for a startup that requires Linux on the desktop but only because I'm a fan (I also always buy Thinkpads). If you required me to work on OSX/Windows, as petty as it sounds, I wouldn't be so keen.


I already use Linux. So to answer the reverse question: I'd definitely be comfortable coding for work on a Mac, especially if I was allowed to make caps lock an additional escape, use my own .zshrc, etc.


Nonsense - everyone knows that the only proper remapping for caps lock is CTRL!

...oh, wait, I get it. You're one of those, aren't you? :P

Editor holy wars aside, on OS X it doesn't appear that you can even remap caps lock to escape, at least not without additional add-ons installed. So you might be out of luck.


I actually don't use vim much any more, but I'm still used to caps lock as an additional escape (er, actually, I think I have caps lock and escape swapped). It's not a big deal.


I already use Linux on a Thinkpad so it's not much of a leap for me (I have a pretty esoteric setup, though) but I'd be cool with a "required" setup if you're providing the hardware.


For me, standardization would be a huge turn-off (not Linux per se). My computer/OS choice is a very personal thing, just like my clothes, spouse, etc.


I'd love to work somewhere Linux was the standard.

I will not, however, work anywhere that requires me to use a Mac. I did my time on one for a job once; never again.


Most of my career I've been forced to work on Windows, so I could live with it if everything else was ok.


Hell yes! The last place I worked forced me to use Windows for writing PHP and Java code for Android... :(


Absolutely, and indeed in the past I have worked for startups who were nearly entirely Linux based.


I wouldn't work somewhere that does not allow me to use Linux on my development machine.


I doubt I'd work at a startup that didn't require using Linux!


Does your live system also run on that particular model of Thinkpad? Are you using HD encryption that only works on this model? If not, what's the point? I wouldn't work at a place like this: to me it smells like a control freak at work, not a sensible requirement. If I'm going to spend at least 8 hours a day in front of something, it had better be something I really like and am comfortable with, which happens to be a Mac and OS X. I love Linux on servers and would tolerate it on the desktop for work if really needed, but forcing a particular make on me? No, thanks. And how about VirtualBox with some standard image?


To answer the title question: a hundred times yes!

I'm sick of all the start-ups thinking that they need money from investors to buy Apple computers and make great apps in Ruby on Rails.



