RK's comments | Hacker News

My best friend had one of these as a kid and we made epic backyard GI Joe movies.

Incidentally he now works as a film editor.


I'm under the impression that it has only gotten worse.

Anecdote: A super smart friend of mine from physics grad school finished in 2008 and went on to post-docs at Caltech and Harvard. He was unable to get a tenure track job, primarily because in that 4 year timespan only 1 faculty position opened up in the entire US for his (tiny) field. He now works at Google.

In fact I'm not sure that anyone from my cohort is still in academia. Other students used to come ask me questions all of the time, because I had worked in industry before going to grad school. Their questions were always some variation of "what are my options if I get out of physics?".

Edit: I'll also add that most grad students I knew gave little thought to their post-grad job prospects before starting grad school. It seems now that the message has trickled along a little better that the outlook is very poor.


I interviewed recently for a Data Engineer position in Chicago and was surprised when 2 interviewers from the data team introduced themselves as astrophysicists. They reasoned that they weren't happy with the career path in academia and so found a way out.


It's becoming more common as far as I can tell. I also left academia (mathematics), since these days academia is less about the work you actually do. My brother stuck it out and finished his PhD in chemistry, but he too bolted from academia since he hated it as well (and that was with the fortune of having a high-reputation thesis advisor, though one who milked him for as many papers as he could before my brother threatened to just up and leave). He's now a senior engineer at Samsung.

The academic landscape is just poor these days - it is a pretty brutal world to operate in.


I'd just like to emphasize:

while most people would agree that we train too many PhDs, I can assure you: the process of training many PhDs (in any discipline) has been very, very good to Google. It selects for, and hones the skills of, people who are quantitative and who can form and test hypotheses.

I went to Google (and took a significant temporary hit to my bioinformatics career) while working as a software engineer on stuff that was far from science. However, I can assure you: my training transferred easily.

Looking at the alumni list for my program (http://biophysics.ucsf.edu/people/alumni), I see a very wide range of outcomes: yeah, a few professors, but also SVPs of companies, venture capitalists, doctors, software engineers, patent counsel, and industrial scientists. You can also see the postdoc bolus.

Another way to put it: going to school is not the fast or easy path.


I'm seeing more and more bioinformatics-related positions in industry that are listing a PhD as a requirement. I'm currently trying to decide if it's worth it to do the PhD now (in Bioinfo or a strongly related program) or to side-track for a little while doing something more directly computational or entrepreneurial. I'm ultimately interested in coming back to bioinformatics regardless. Would you say it's worthwhile to do the PhD (even if intending to work in industry), or to try to make up for the lack of a PhD with other experience?


The older, safer part of me says get the PhD. The younger, more risk-prone part (which I ignored for too long) suggests you're going to make a bigger impact sooner by doing something computational and entrepreneurial.

The PhD means that when you get to industry, you won't end up as a lab tech. It doesn't mean that you will be running a lab, however.


Do you have a write-up of your answers lying around somewhere? I'd be interested... (half a year to go on a Master's)


No write-up, but people I know have left physics for:

* Quantitative finance

* Insurance (working on models that were beyond what the actuaries were trained to do)

* Data science

* Scientific equipment R&D

* Scientific equipment sales

* Popular science writing (this is a bit of an outlier!)

* Defense contracting (engineering, "scientific" programming, etc)

* Programming

* Door-to-door insurance sales (no joke)

I usually tell people to brush up on their programming skills as much as possible. If you're a theorist who only does pencil + paper or maybe Mathematica, it might be hard to find a decent job. Also, it never hurts to talk to / network with any industry people that are related to your field (software or hardware vendors, etc).

I'm now doing data science, but have also done hardware development and electrical engineering related things (signals). When I was transitioning to data science I also did some very specialized consulting related to my PhD. A few people paid me to do simulations and/or help them implement some techniques that I worked on as a grad student.

Also: I had a very supportive advisor, who encouraged me to accept a job offer before I graduated.


You could make a plot like Figure 1. Look for the turning point (do some calculus if you can, i.e. d(perf)/d(dim) = 0).


Derivatives require continuity. It would be sufficient to simply look at which number of dimensions gave you the best cross-validated classification rate.
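
Something like this, as a quick sketch, assuming scikit-learn (the digits data, PCA, and logistic regression are just placeholder choices):

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    X, y = load_digits(return_X_y=True)

    # Mean cross-validated accuracy for each candidate dimensionality
    dims = range(1, 31)
    scores = [cross_val_score(make_pipeline(PCA(n_components=d),
                                            LogisticRegression(max_iter=1000)),
                              X, y, cv=5).mean()
              for d in dims]

    # Keep the dimensionality with the best cross-validated score
    print(dims[int(np.argmax(scores))])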


Alas, it rarely looks clean like that. Actually never, in my experience, for real problems.


Los Alamos is in the middle of nowhere, but Sandia is smack in the middle of a metro area of ~1M people (Albuquerque).


Right, sorry! I tend to forget they're not literally next to each other since they're both nuclear security labs and in the same-ish area.


I'd like some UI element that lets me know how long it's been since I read/used a tab. I tend to open lots of tabs and then browse them as is convenient later. Sometimes this means tabs get "lost" (i.e. become stale before I read them). This would let me know which tabs I might have overlooked or should close, bookmark, etc.


Many Irish already had their names Anglicized by the English before they left Ireland.


I'm not that familiar with military aircraft development timelines, but this stuck out as my favorite part:

> A hypersonic plane does not have to be an expensive, distant possibility. In fact, an SR-72 could be operational by 2030.


By comparison, the SR-71 was first proposed around 1960 and operational by 1966. http://en.m.wikipedia.org/wiki/SR-71_Blackbird


This is one of the most perplexing things about our modern world - the seemingly exponential increase in the length of time it takes to get anything done, especially large, technically complex projects. This is true for the Saturn V, for the World Trade Center, dams, highways, high speed rail, etc. It makes me wonder where these "productivity improvements" really are, especially the ones that computers presumably give us. Perhaps AutoCAD and MS Word and email make us feel more productive, but are actually slowing us down. Or perhaps all of it just increases the velocity of money through the economy, which particularly helps those whose income is proportional to transaction rate (banks, brokers, and governments).


That's a good question. I think, though I'm not sure, that the answer is that people are now far more expensive, so adding more people means less money is made. Let's use your dams example. They built a bunch of big dams in the 50s and 60s, and each one had more than 10,000 people working on it. They were mostly publicly financed, so more or less not built to make money. They didn't worry about adding a few more (or a few hundred more) people to the job because people were cheap. Now the people who do this work are really expensive, and adding a few more means the people in charge make less money. Another big factor is probably rules, regulations, and laws: there were far fewer back then. Good? Bad? I dunno.


I don't think it's the cost of people. Technology has done a pretty good job of reducing the cost of simple labor. But technology hasn't done a great job of managing complexity. See, for example, the above jab at the "millions of lines of software code" managing the SR-72. That stands in stark contrast to the "keep it simple stupid" motto attributed to Kelly Johnson, the SR-71's designer.


Progress!


Nice reference.

1.29 happened to be exactly what I was looking for:

  for subset in itertools.chain(*(itertools.combinations(a, n) for n in range(len(a) + 1))):
      print(subset)
I spent way too much time writing a function to come up with these combinations.


You can also do

    itertools.chain.from_iterable(itertools.combinations(a, n) for n in range(len(a) + 1))
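
For a small list both versions give the full powerset; a quick check, assuming a = [1, 2, 3]:

    import itertools

    a = [1, 2, 3]
    # All combinations of every length from 0 to len(a), i.e. the powerset
    print(list(itertools.chain.from_iterable(
        itertools.combinations(a, n) for n in range(len(a) + 1))))
    # [(), (1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)]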


I just checked out my sites using elinks and fortunately they came out pretty well.

One is based on Bootstrap and one on Skeleton. I made some icon-only links (GitHub, Twitter, etc.) visible in elinks by adding img tags like <img src="" alt="some text" style="display:none">, so the alt text shows up in text browsers. Seems like an OK hack.


> On the other hand Mel Gibson was Aussie and became American. He's not claimed anymore.

According to Wikipedia, Mel Gibson was born in the US to American parents and moved to Australia when he was 12. Somehow I remembered that factoid when I read your comment :)


Thanks. I didn't fill in that detail. I thought only his mother was American; I should always check my facts before I post anything here. So in, and then out, but oddly, out in a similar way to how Rupert Murdoch is out.

Pride and Prejudice.

