
Yep, I did software development in academia for a biochem lab and they paid less than 1/3 what I make in industry. Not only that: I was lower on the totem pole than a first semester PhD student, there was zero potential for career growth of any sort (the prof I worked for laughed out loud when I asked about it), and my job security was entirely governed by the grant approval/extension whims of the NSF and NIH.

Foreknowledge of all that wasn't enough to keep me from working at the job for a while. It was a super interesting experience, and I learned an enormous amount about biochem, comp bio, synthetic bio, and several other fascinating subjects.

What eventually caused me to leave was the continuous, losing battle for sane software development practices. It wasn't just that lab: everyone I encountered on the techy side of bio - save for the oddball comp bio or synth bio prof/student with a CS background that included industry experience - was completely averse to treating their software as anything other than a means to an end. In the year and a half I lasted before taking a job in industry, that one lab easily wasted hundreds of work hours navigating easily preventable tech debt, writing the exact same code for the Nth time, fixing the same deployment or revision control mistakes for the Nth time, etc, while any attempt on my part to put standards and practices in place to alleviate any of it was dismissed out of hand as a waste of time.

In short, I agree that there's a shortage of software engineering knowledge and skills in the field, but beyond the obvious financial, organizational, and career development hurdles keeping talent away, a major attitude adjustment is required of the researchers themselves.

